Pages

Tuesday, November 22, 2011

Doom 3 id Tech 4 Game Engine Source Code Released

It is awesome to see the Doom 3 id Tech 4 Game Engine source code released and available at https://github.com/TTimo/doom3.gpl. This does not include the Doom 3 assets, but it is great to have all the code on GitHub.

Looking at the source code, a few similarities show up with other engines.

Curl for HTTP requests (common in a few engines).
OpenAL for audio (nice to see, also fairly common).
Ogg Vorbis format for sounds (used in a few places).
DirectX SDK requirement (for Windows).
JPEG (version 6) for images.

Some other notes at a glance:

Custom UI code.
MayaImport tools.
Tools for dialogs, material editors, particles and more.

It is great to see this, and many thanks to John Carmack and the id Tech team for sharing. I will definitely be looking at this further, and I am eager to see what the gaming community mods and comes up with using this code.

I will certainly be reading more of the code, which in general looks to be of quite high quality. There are not a lot of comments (which would be nice to have), but overall the functions and classes are clearly named and look to be a pleasure to read. A number of the external libraries, like Curl and OpenAL, also ship with their custom documentation, so it is nice to see how those APIs are being used and what else can be extended.

Lots of fun, I have been busy recently, but will certainly have to devote some time to this.

Cheers,
Michael Hubbard
http://michaelhubbard.ca

Saturday, November 12, 2011

Adobe not supporting Flash on Android and mobile devices.

It is always interesting to see how some of the larger companies move. This article http://www.businessinsider.com/adobe-engineer-heres-why-we-killed-flash-for-mobile-2011-11 reports that Adobe is no longer going to focus on supporting Flash for mobile devices. This was a hot topic between Apple and Adobe, with the iPhone not supporting Flash, but it looks like Adobe has now come around to Apple's position.

With things like Google's Swiffy http://www.google.com/doubleclick/studio/swiffy/, which converts SWF to HTML5, it is good to see Adobe jumping on that and not letting Google run away with what is still (I believe) Adobe's proprietary format. While I have not spent a lot of time working with Flash, it is still a stretch to think that HTML5 will be an exact replacement for it. I think when people suggest this, they are looking at the end product rather than the tool that Flash is (a timeline, with various tweens, staging, etc.). HTML5 may be able to do everything that Flash does (in terms of visuals), but the tools to create the content are what made people adopt Flash over something else (like Java applets, or Javascript and animated .gifs).

I think Flash will still be around in some form; maybe it will output HTML5 instead of SWFs, or maybe it will generate something else. Time will tell whether something can fill that niche of 2D content that is relatively fast to implement on the web, but I am sure Adobe, Google, Apple (and Microsoft) will all be paying close attention.

I am sure by the iPhone 6 there will still be something else important that separates the mobile devices :P

Cheers,
Michael Hubbard
http://michaelhubbard.ca

Saturday, November 5, 2011

Book Review: Head First Design Patterns

Head First Design Patterns by Eric Freeman and Elisabeth Freeman is a fun read. I was always curious what the content of these O'Reilly books that seem so different would be like (lots of pictures, jokes, crossword puzzles, etc.), and it turned out to be both entertaining and informative. The book really does tackle a number of important issues, and the "armchair discussions", where two design patterns talk (and argue) about their differences, are both funny and some of the most useful insights into distinguishing patterns I have seen.

The Freemans really do create a useful introduction to design patterns, with a number of entertaining examples that make you enjoy reading each page; it is certainly a breath of fresh air for anyone used to very academic and "serious" books. I felt the content was great, and the format was excellent for an introductory book.

While I don't normally go for books based on Java, this is more of a theory book, and while the source code examples are in Java, it should not be difficult for anyone to follow along (outside of maybe some of the Remote Proxy examples, which leverage Java's Remote Method Invocation).

The main patterns covered include: Strategy, Observer, Decorator, Factory, Singleton, Command, Adapter, Facade, Template Method, Iterator, Composite, State, Proxy and Compound Patterns. There is also an appendix that covers a number of other patterns. I would have preferred to see a chapter on Flyweight or Builder instead of Singleton, but whether or not Singleton is an anti-pattern is still up for debate.

One of the most interesting approaches in the book is breaking the Model-View-Controller paradigm apart into separate patterns. The Model becomes an Observer subject, allowing state information to be passed back to the View or the Controller while remaining independent of both. The View becomes a Composite (likely a collection of widgets or something similar). The Controller implements the Strategy pattern, which configures the View with a strategy and changes the state of the Model as necessary. This explanation fits well (especially the Model using an Observer), and I think it is one of the nicest ways to implement the MVC design.

While I have already read a few books on design patterns, unless you are looking for a specific language implementation or the original Design Patterns book by the Gang of Four, I think this is a good intro and an overall fun read.

Bye for now,
Michael Hubbard
http://michaelhubbard.ca

Thursday, November 3, 2011

Character Creation: How Much Customization?

In the new game trailer for Soul Calibur 5 http://www.gametrailers.com/video/character-creation-trailer/723519 they show off the impressive character customization. It is quite well done, and while not exactly the same as something like Spore, where appendages and weight change, it is nevertheless still quite impressive.

While it is amazing technological work for a two-player game, this type of extreme customization currently wouldn't work in an MMORPG. That is not to say you couldn't have customization in an MMORPG, it happens all the time, but hundreds or thousands of such characters in a town would likely bring even the best gaming computers to a crawl as every unique texture and separate piece of geometry tried to get rendered. Sure, there are tricks that can be done (different levels of detail, texture resolution sizes, combining as many pieces as possible into a single draw call, replacing similar pieces with one piece, etc.), but most of these have the distinct chance of making all the other characters look similar and simplified (which is not what extreme customization is about).

There will likely be a time when every character, tree, rock, brick, stone and even blade of grass is unique, but until then, one of the main problems I have seen with game designers (and developers) going down the customization path is knowing where to draw the line. It is very important (crucial) to establish how much customization the player is allowed and whether it can be supported. If you want to go the Second Life route, you will have to make sure the engine was designed with that specifically in mind. If you find yourself worrying about how to render each character's belts, earrings, shoelaces, cufflinks, ties and shirt pins, you have likely gone too far (perhaps even off the deep end).

Computer games (and graphics) are mostly a series of tricks, but you would need a whopper to trick the graphics card and CPU into rendering outrageous customizations in an MMORPG without it running like a screen saver. Still, the Soul Calibur customization sure is pretty, and it is only a matter of time before someone tries to do it with hundreds of characters (hopefully the computers of the future will be that much better).

Best of luck,
Michael Hubbard
http://michaelhubbard.ca

Saturday, October 15, 2011

Book Review: More Effective C++

A blast from the past, More Effective C++ by Scott Meyers is not just another C++ book; it touches on a number of features and concepts that deserve far more discussion than most C++ books give them.

Some of the topics include: smart pointers, lazy evaluation, preventing resource leaks in constructors, preventing exceptions from leaving destructors, understanding the cost of exception handling, understanding the origin of temporary objects, understanding the cost of virtual functions, base classes and RTTI and more.

One part I especially liked was avoiding gratuitous default constructors. This is more of a design decision, but still a useful one: if member functions have to test whether fields have truly been initialized, it complicates the design and can lead to bugs and more complicated behaviour in handling these uninitialized objects.

Meyers talks about a number of design decisions and the side effects of going too far with overloading methods, or with operator new and operator delete (and the numerous hacks and design decisions that have to be considered if you do decide to handle your own new and delete; i.e., don't do it).

The chapter on exceptions was particularly interesting, and contains probably my favorite description of exceptions:
The addition of exceptions to C++ changes things. Profoundly. Radically. Possibly uncomfortably. The use of raw, unadorned pointers, for example, becomes risky. Opportunities for resource leaks increase in number. It becomes more difficult to write constructors and destructors that behave the way we want them to. Special care must be taken to prevent program execution from abruptly halting. Executables and libraries typically increase in size and decrease in speed... so why mess with exceptions, especially if they are problematic as I say? The answer is simple: exceptions cannot be ignored (pg 44).

Thinking about exceptions this way (as something that cannot be ignored) puts the programmer in a great mindset to consider what they really regard as unignorable. This is quite different from return codes or status variables, which can be ignored, albeit at the risk of potentially undefined behaviour. Exceptions are best used when something exceptional happens; for performance reasons, it is best to limit their use to cases that are genuinely exceptional.

There is also some good practical advice, such as programming in the future tense. Consider how the code will grow in the future, or how it will be used. Provide complete classes, even if some parts aren't currently used. Make the class easy to use and hard to use incorrectly. Generalize the code whenever appropriate to maximize reuse. This is not to say you should design and program things you aren't going to need, but rather create logically complete classes (if something can be set, it probably should also have a get, and if an entry can be added, there probably needs to be a way to remove that entry too).

Even though the book is not hot off the presses, it still offers quite a bit of useful insight into C++ and fills a role that few books really delve into. If corners of your C++ knowledge are growing cobwebs, this is a useful book to help dust some of them off.

Cheers,
Michael Hubbard
http://michaelhubbard.ca

Saturday, September 10, 2011

Book Review: Cross-Platform Development in C++

I decided to look into some more books on cross-platform work to see what professionals were doing in this field. The book Cross-Platform Development in C++: Building Mac OS X, Linux and Windows Applications by Syd Logan describes some of the techniques used in porting the Mozilla/Netscape browser to multiple platforms, and the challenges involved in such a massive project.

The basic theme of the book is abstraction, especially when dealing with low-level implementation details: file I/O, threads, byte order, any platform-specific code, etc. Logan recommends creating factory classes that return a concrete implementation of an interface for these types of problems. The code should NOT be sprinkled with #ifdefs, as that makes coding and debugging a nightmare when nearly every line could contain one or more of them. Instead, creating an abstract factory that returns the necessary concrete implementation is essential.

In this book Logan shares tips and workflow processes that allowed these projects to succeed on multiple platforms. These included having experts for each of the platforms, and requiring developers to test on all three major platforms (Windows, Mac OS and the target flavour of Linux) before committing to the code base. Logan also suggests compiling with different compilers, paying attention to warnings, and considering whether the native compiler for each operating system is the best approach (use whichever compiler makes the most sense). One scheduling requirement was that all functionality had to work on all platforms at the same time; this was essential to keeping everything working and preventing features from being implemented on only one platform (which goes against the application being entirely cross-platform).

The book goes into quite a bit of detail on setting up Makefiles for each of the platforms using Imake and having a cross-platform make system. It also suggests using a cross-platform bug reporting system (Logan uses Bugzilla and Tinderbox, since they are other Netscape/Mozilla projects). CVS is used (although Subversion is mentioned), and it is also a requirement to have cross-platform tools (while relying on Cygwin to emulate a lot of Linux tools on Windows).

There are other tips, such as considering the NSPR platform abstraction library (Netscape Portable Runtime), being wary of what standard your floating point calculations are performed under, avoiding serialization of raw binary data (unless you are using a convention for the data), and using limits.h for any size information (instead of relying on sizeof), etc.

One particularly good example in the book was related to the floating point representation:
float x = 4.0/5.0;
if (x == 4.0/5.0)
{
    printf("Same\n");
}
else
{
    printf("Different\n");
}
The above example will print "Different", since floating point expressions are evaluated at the greatest precision available (in this case double), so the float x has lost precision relative to the double-precision 4.0/5.0 it is compared against. The example below will print "Same".
double x = 4.0/5.0;
if (x == 4.0/5.0)
{
    printf("Same\n");
}
else
{
    printf("Different\n");
}
The book also deals a lot with user interfaces (which are often a major issue for portability). Logan goes into some detail on the MVC (Model-View-Controller) paradigm, but also covers wxWidgets http://www.wxwidgets.org/, some of Netscape/Mozilla's XML-based GUI toolkit XUL https://developer.mozilla.org/En/XUL, and his own open source project Trixul http://www.trixul.com/.

Trixul is an interesting project, and some of its browser-like functionality (such as embedded Javascript interacting with the DOM) involves concepts that would be interesting to apply to a game engine, although perhaps in a slightly different way. Serializing the UI in XML is incredibly valuable, although likely something individuals will have to write themselves (see my post on XNA Serialization for this). Trixul allows C++ components to be written that can be accessed through a Javascript component manager to extend the functionality, within the SpiderMonkey Javascript engine. I won't go into too much detail about Trixul here, but if you are interested, check out the links (and the book).

All in all, I think this book gives great insight into the complexities of cross-platform development, especially when working in a large team. I still stand by my post arguing that indie game developers should not initially target multiple platforms: http://gameprogrammertechnicalartist.blogspot.com/2011/08/indie-multi-platform-game-development.html. The amount of effort involved in making this work, the number of experts involved, and the time it takes to get just three major platforms working can be significant. If possible, try to use tools that are platform independent, see about abstracting all platform-specific code, test on all systems as you write the code, and realize this is going to take some time.

Best of luck,
Michael Hubbard
http://michaelhubbard.ca

Saturday, September 3, 2011

Coffee Break Hero 14 Day Sprint.

This is cool: a complete RPG in 14 days, with 24-hour-a-day live streaming, to raise money for the Child's Play charity: http://www.bigblockgames.com/games/coffeehero/challenge/

The live streams are at: http://www.twitch.tv/bigblockgames
More info on the charity is here: http://www.childsplaycharity.org/

Looks like this is shaping up quickly. They are only a few days in, but it will be interesting to see the end results.

This is a great idea for charity, and if you are reading this I'm sure they would also appreciate any donations for their game.

I look forward to seeing their end results, and hope they raise a lot.

Best of luck guys,
Michael Hubbard
http://michaelhubbard.ca

Sunday, August 28, 2011

Book Review: Write Portable Code

There are not a lot of books on portable code, but Write Portable Code: An Introduction to Developing Software for Multiple Platforms by Brian Hook does a good job covering many of the topics.

There are a number of rules for portability:

- Never assume anything (memory, size of built in types (int, float etc.)).
- The code will likely have to have non-portable elements to run efficiently.
- Establish a reasonable baseline of platforms, (not both PS3 and Commodore 64).
- Never read or write structures from memory or cast raw bytes to a structure.
- Always convert to or from a canonical format when moving data in or out of memory.
- Develop good habits and use tools and platforms that encourage strong practices.
- Avoid new language or library features.
- Integrate testing.
- Use compile time assertions and strict compilations (avoid excessive conditional compilations).
- Write straightforward code.
- Understand anything can change between compilers (floating point can work differently).
- Leverage portable third party libraries, but be careful.
- Performance and resource usage must be as portable as your features.
- Portability also means supporting other cultures, regions and languages.
- Consider using a language more suited to the task (python, C#, etc. are easier than C++).
- Systems are becoming more secure (folders, ports etc.).

Brian Hook also provides POSH (the Portable Open Source Harness) at http://hookatooka.com/poshlib/, which demonstrates some examples of portability and is a good stepping stone for those looking at porting C++.

I think the book is useful; it demonstrates a number of source code examples from his Simple Audio Library (SAL). Brian Hook also goes through a number of examples from the different standards, including IEEE 754, C89 and C99, which are interesting to look at from a portability focus, and it would be great if more books engaged with the standards in their work.

Bye for now,
Michael Hubbard
http://michaelhubbard.ca

Thursday, August 18, 2011

Indie multi-platform game development

Often indie developers get caught up in following the "big boys" and trying to do as they do. With a large company, there is a lot of focus on releasing on multiple platforms, in multiple languages, on their first release, and to create a big splash. My advice is: don't do it.

Focus on one platform, make the best possible game and go from there. If you are using a third party engine that already does that for you, fine, but if you have to pay for extra licenses and do a lot of extra development, you could very well be wasting your time. If you do not make enough money on your (first) target platform to support moving your game onto another platform, your game will not likely be a success anyway.

Look at Angry Birds and Plants vs. Zombies, two of the most successful smartphone games of all time. Both originally came out on limited platforms (just iOS for Angry Birds, PC/Mac for PvZ). Neither of these companies initially tackled all the platforms the games now run on; the games were likely only ported to those platforms because of the success of the original.

It is worthwhile to focus on writing customizable code, with wrapper classes and functions for things you know could be or will be swapped out. An obvious example: graphics API wrapper classes are essential if you need to support both OpenGL and DirectX, or some other graphics API. It is also worth having default functionality for things that could be a problem (for example, if a platform does not support pixel shaders or only allows a limited number of textures on the GPU, make sure your game does not rely on these too heavily).

If you really want your code to be portable, port early and port often. In a larger team I have seen it work where different developers run different operating systems and graphics cards, in an attempt to guarantee that most of the functionality works in the daily build. It is often easier to catch issues as they crop up than to hunt for hours, days or weeks for all the differences keeping the code from running on another platform. This introduces other workflow issues, and will sometimes require developers who broke the build to get access to other machines so they can fix it. But it does help guarantee that the code written is more portable, and if it works on two very different architectures, chances are it will run on those in between. The problem for indie developers is that they do not usually have the luxury of multiple machines, dedicated developers, and perhaps the time.

If you believe the market has shifted and you really should be focused on another device or platform, that is different, but chances are your first assumptions about which platform to develop for were well founded. If your game was not a real success on your ideal device or platform, chances are not good that you will see a huge difference in success if you spend the time porting to other platforms. This can be frustrating, but instead focus on improving your original game first, or on making your next game even better. If you want to take a lesson from the console developers, consider this: if your game is really that good, people may even buy the platform to run it :P.

Best of luck,
Michael Hubbard
http://michaelhubbard.ca

Thursday, August 11, 2011

Siggraph 2011 Part 3

It is over! I thought I would add some closing thoughts.

I managed to check out the Blender display (back in the far corner) and they had some cool stuff going, but I could not help thinking that the Autodesk slogan "Don't blend in, stand out" may have some additional marketing subtext. I like Blender; with their latest updates they are the best open source 3D modeling package out there, and while they are a few steps behind all Maya can offer, free is hard to beat when you are selling something. I still prefer Maya and really like Autodesk in general, but it was just a curious thought that passed through my mind.

Otherwise, I think Siggraph is a great pool of knowledge, but next time I go I will try to spend the majority of my time in the talks, rather than the exhibition and galleries. I learned quite a bit, but I wouldn't say my mind was blown or anything like that. I felt the exhibition and gallery were focused more on beginner-to-intermediate information, which is great when showing off a new technology, feature or process, but there is only so much detail you can go into in those kinds of demonstrations before you start losing people. The more formal presentations would likely be less interested in attracting a crowd than in exploring the depth of the ideas, and that, I think, would be very appealing.

Overall, I am very happy to have attended; I saw some neat things, learned some new technologies, and generally came away excited to work in this industry.

Can't really ask for more than that,
Michael Hubbard
http://michaelhubbard.ca

Wednesday, August 10, 2011

Siggraph 2011 Vancouver Part 2

So, I went to the exhibition and some of the galleries and ballrooms today. Lots of neat stuff. If I were to sum it all up, it would be: motion capture, 3D printers, Autodesk, Maya and NVIDIA. Those seemed to be the major players, although that is not surprising, since they are also some of the more expensive technologies and would do well to advertise.

Some of the talks I went to include:

World Creation in CryEngine: I am awaiting the release of the free CryEngine 3 SDK, which will likely be this month (I checked, and there is no mention of it on their webpage yet). The demo was OK, a bit simplistic in talking about the interface, but it looks impressive nevertheless.

Photoshop 3D texture map integration: An interesting talk about how 3D height maps can be created (and tested in 3D) in Adobe Photoshop CS 5.5, with some good examples and neat features I will have to try out.

Real World Camera Rig Creation: The focus of this talk was improving the use of the camera in Maya. This was a little different: the basic concept was to build a crane, dolly or curve-path rig and attach a camera to it, to mimic real-life cameras. Setting up and constraining the camera in this way allows for more traditional movements alongside the free-flowing camera that is so easy to misuse in a 3D animation or game.

ZBrush: Creation of Venom and Carnage from Spider-Man: It was just very impressive to see how talented the artists were, creating these characters from a simple head (sculpted and painted in less than an hour) with amazing results. Just seeing how some of these artists work gives ideas for improving your own workflow, but really nothing but lots of practice can get you to that skill level.

Adobe Premiere Pro Integration: Showed how Premiere Pro can integrate with After Effects and Encore; by sharing the same project information, all the programs can interact and update in sequence automatically. This allows, for example, a project to be open in After Effects and have an effect added to it (like rain or snow), which automatically updates the same project in Premiere so it can be viewed and edited as necessary. This encourages experimentation and allows quick results to be seen immediately.

ILM Transformers 3 Colossus: The stats on the Colossus transformer (the giant worm robot) in Transformers 3 were very impressive: over 16 million polygons, 13 separate pieces, and the equivalent of two and a half Devastators (from the previous movie). It took a machine with 12 cores and 48 gigs of RAM over 40 minutes to load the shot where Colossus is tearing apart the building, which has multiple layers of complexity and physics (with a skin-based building model) to get the shot of the building falling over.

NVIDIA Parallel Nsight: http://developer.nvidia.com/nvidia-parallel-nsight is a very impressive tool integration for Visual Studio that breaks the scene apart into different draw calls and gives the developer a lot of previously hard-to-get information about the rendering process. A tool that allows a developer to select a pixel onscreen and not only track it but see what draw calls went into its creation is amazing. I will definitely be spending some time further investigating this tool, and since it is free, I recommend everyone interested in 3D graphics programming check it out; it looks like it will be a new favorite of many developers.

Siggraph Dailies: This was a series of one-minute segments from many different studios, backgrounds and styles. It was nice to see a brief but interesting clip and a short description (sometimes only a few sentences) of the challenges in getting that shot, the techniques used, or the ideas behind it. Overall, there are (as expected) a lot of cheats that go into a shot to get the desired effect. One interesting technique used in Tangled (from Disney) was running the hair simulations in reverse, then playing the frames backward.

Lots of fun, the exhibition was pretty neat, the art was interesting, the technology was cool and I learned some neat stuff.

Hope you also made it there,
Michael Hubbard
http://michaelhubbard.ca




Monday, August 8, 2011

Siggraph 2011 Vancouver

Siggraph is in Vancouver. It is pretty neat to be hosting the 3D animation festival, and there are lots of people excited to go. Some of the Animation Film Festival is today.

Some of the standouts include:

Le Royaume: http://www.youtube.com/watch?v=y6ZmMjMdrqs
Coca Cola Siege: http://www.youtube.com/watch?v=Shvwd7VYpE0
Spots vs Stripes: http://www.youtube.com/user/spotsvstripes#p/search/0/Zh-s3auYdKo
Dreamgiver: http://tycarter.blogspot.com/
Hezarfen: http://www.youtube.com/user/Supinfocomgroup#p/a/u/2/5YzT_RdBUvs
Kia Soul this or that: http://www.youtube.com/watch?v=jWJ4jHZwUPo
Meet Buck: http://www.youtube.com/watch?v=9vt4fBtxWYY
New Digs: http://martinsenart.blogspot.com/
Rubika: http://blog.autourdeminuit.com/distribution/rubika/

There were of course the big boys: Transformers 3 is amazing visually; I watched it twice in theaters (once in 2D, once in 3D) because it was just so visually impressive. And of course some of the games, movies and animations that are more mainstream (or are part of movies) are awesome too. Check out the animation festival list here: http://www.siggraph.org/s2011/for_attendees/computer-animation-festival for all the details.

Cheers,
Michael Hubbard
http://michaelhubbard.ca

Tuesday, August 2, 2011

Book Review: Godel, Escher, Bach: An Eternal Golden Braid

Godel, Escher, Bach: An Eternal Golden Braid by Douglas Hofstadter is not really a programming book, but it is a worthwhile read for those interested in math, logic, intelligence, patterns and recursion. It is a book worth reflecting on, thick with examples, clever insights and thought-provoking questions, with enough content and puzzles to make it worthy of a re-read later in life.

The book is broken up by dialogues between chapters: conversations between the Tortoise and Achilles (an homage to the philosopher Zeno), in which the Tortoise is usually teaching Achilles some paradox or life lesson, allowing the reader both to enjoy the wit and wordplay between the characters and to learn the lesson alongside Achilles. Other characters are introduced as the chapters go on, each one bringing something new or introducing a new topic or idea (just wait for the Crab).

For those who get stuck on the title: the book is not a comparison of math, art and music, but rather examines the strange loops, patterns and paradoxes that exist in the universe. You may have heard one of its most famous quotes about estimating deadlines: "Hofstadter's Law: It always takes longer than you expect, even when you take into account Hofstadter's Law". Self-reference and recursion play a large role in the book, and lead toward one of its main themes: consciousness and self-awareness emerging from complex neurological mechanisms. The idea of complexity built upon other forms of complexity suggests that the building blocks of consciousness stem from the underlying mechanics.

The main example of this theme of consciousness is Hofstadter's ant hill. The ants exist as individuals but are also part of a much larger network, where each individual ant becomes part of a larger collective "consciousness". The ant hill as a whole is able to adapt to conditions in ways an individual ant could not, allowing for better survival of the entire group. The ant hill, in essence, is a consciousness made up of a collection of simpler parts, in some ways similar to how the human brain is made up of neurons that have simpler functionality than the brain as a whole.

Do not let the book's size be too daunting; I think those who read the first couple of chapters will likely want to see it through. It will take some time, as the book is large and thick with examples and thought-provoking content. If you don't think you will like it now, come back to it in ten years and try it then.

Recursion is a wonderful thing,
Michael Hubbard
http://michaelhubbard.ca

Sunday, July 24, 2011

Book Review: Starting a Successful Business in Canada KIT

Disclaimer: Laws are constantly changing and this blog post makes no representations or warranties of the outcomes or results of using information in this post. The author of this post and source information will not assume any liability for any claims, losses or damages from the use of this information.

Sorry for that wordy disclaimer, but it would be a real shame if something meant to spark people's interest were used as anything else. The reason for this post is that I feel a lot of developers do not spend enough time learning the business side of things. If you are smart enough to develop complex games, you should be able to know as much about business as someone who sets up a hotdog stand (by the way, I do like those guys a lot; this is only meant as an example of the complexity of creating the final product).

The book Starting a Successful Business in Canada KIT, 18th edition, by Jack D. James, MBA, LLB, is one of the first business books I have read outside of a few for courses I took in university. James recommends building a business around things you are passionate about, as a business can quickly eat up a ton of your time. If you do not have an interest in what you are doing, you are less likely to persevere through the hard times.

There are some useful techniques in setting up your business:

Constantly re-evaluate your business process. If you are doing minor tasks to keep your business running (like repairs, tooling, etc.), consider expanding your business around them (James's example was with boats: if you rent boats and motors, and spend a lot of time fixing the motors, consider also buying broken motors and fixing them up to rent or sell as part of your business).

Keep copies of everything: all your receipts, printouts, photocopies of cheques received for payment, bills and tax information. There is lots of software to help you get organized if you are finding it difficult to keep track yourself.

Before starting a business do a break-even analysis, which is:
Total Costs = Variable Costs + Fixed Costs
Fixed Costs are things you cannot get away from (rent, insurance, salaries, equipment, etc.)
Variable Costs are any additional costs it takes to create a sellable unit (materials, labour, etc.)
The goal of the break-even analysis is to see how much you have to sell to break even. Everything above that point is profit; everything below it is a loss, and a reason to re-evaluate whether the business will make money.
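The formula above can be turned into a quick calculation. A minimal sketch in Python, with entirely made-up numbers (the function name and figures are mine, not the book's):

```python
def break_even_units(fixed_costs, price_per_unit, variable_cost_per_unit):
    """Units that must be sold before revenue covers total costs."""
    contribution = price_per_unit - variable_cost_per_unit  # profit per unit sold
    if contribution <= 0:
        raise ValueError("Each sale loses money; the business can never break even.")
    return fixed_costs / contribution

# Hypothetical example: $12,000/year fixed costs, $30 sale price, $10 unit cost.
print(break_even_units(12000, 30, 10))  # 600.0 units a year just to break even
```

Anything sold beyond that 600th unit is profit; selling fewer means a loss for the year.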

There are multiple structures for creating a business: Proprietorship, Partnership and a Limited Company (or Corporation). After reading the pros and cons (and though I am not a lawyer), it seems like almost everyone would recommend a Limited Company. It costs a little more to get started, but offers a lot more protection to you as an individual (unless you have signed a personal guarantee). The difference is that with a sole Proprietorship you and your business are considered "one entity": the business cannot easily be sold or transferred, may not be eligible for government loans, and creditors can go after your personal assets (house, car, etc.) if your business is sued. A limited company exists as a separate legal entity, gets better tax treatment, is eligible for government funding, can be sold, and is generally perceived as having more status in the marketplace.

There are a number of requirements that you should be aware of; they may include any or all of the following.

All Businesses:
Registering for GST (to recover money spent on business purchases).
Privacy Acts (acts for collecting personal information of clients)
Sales Tax (tax on all tangible personal property that is purchased or imported)
Patents (trademarks, copyright etc.)
Product Standards (CSA approved)

Manufacturing:
Licenses (loan money, handle food, transport items, manufacture, modify natural resources etc.)
Federal excise tax (for any manufacturing)
Custom duties (if dealing with imports and exports).

Employer:
Federal income tax (if you are an employer, need to handle the income deductions)
Employment insurance (if you are an employer and not a proprietor)
Canada Pension Plan (if you are an employer)

Of course there is more to it, but if you are a game programmer (potentially an indie shop of one), you may not have to worry about the employer aspects or many of the manufacturing elements. It is still worthwhile to have some knowledge of what is involved. All in all, if you need more advice, consult a lawyer or accountant; they will likely set you in the right direction. There are some tax benefits you can look at (even working from home), but you will have to do a bit more research on that part of it.

Fortunately there is lots of support: the Business Development Bank of Canada (BDC) offers a wide range of services for all stages of business http://www.bdc.ca/, and the Canada Business Service Centres offer information about government services and programs: http://canadabusiness.ca/

All I am saying is: do not be afraid of something that millions of people already do: http://www.ic.gc.ca/eic/site/sbrp-rppe.nsf/eng/h_rd01252.html. I'm not saying go out and create a business now, but if you have that next great idea, do not let the unknown fear of "what is a business" stop you. Sure, if you have never done something like this before it is a little out of your element, but like everything it will become second nature with a bit of knowledge and practice. You can even check out the information involved online: http://www.cra-arc.gc.ca/tx/bsnss/tpcs/bn-ne/bro-ide/menu-eng.html (sorry international friends, I hope you can find a similar link).

Best of luck,
Michael Hubbard
http://michaelhubbard.ca

Monday, July 18, 2011

Book Review: MEL Scripting a Character Rig in Maya

The book MEL Scripting a Character Rig in Maya by Chris Maraffi is an interesting book on automating more of the rig creation process (something I am a big fan of). Maraffi has a video describing his tools on YouTube at http://www.youtube.com/watch?v=x4s6-ahJWvU. This is a good book for those interested in MEL scripting and rigging, and it covers some more advanced concepts (such as creating a full UI for the rigs, and tools to create and bind rigs to a mesh).

Overall, the majority of the book covers the creation of a basic FK rig through to a more complex IK rig, with a lot of focus on MEL scripting and setting up the GUI. The end result is a very nice collection of controls and GUI elements to drive the character, as well as some neat advanced controls/scripts for handling the character's breathing, eye jitter and stretchy spine.

While it would be ideal if the code base were in Python instead of MEL, it is understandable, as MEL is still the most popular Maya scripting language. The code works and has a lot of comments, which will help the beginner, but it is also fairly reliant on its own naming conventions, which results in a lot of magic strings holding the code together. I also feel the code might have benefited from being broken into smaller, simpler functions. There are a number of cases where a function does a lot, and while it is easy enough for those with a developer background to understand, fairly long functions are more difficult for a beginner to wrap their head around. The Maya GUI programming is pretty straightforward; it is easy for a Maya GUI script to get quite messy, but Maraffi does a pretty good job of keeping the GUI examples organized and logical.

All in all, a good book with good examples and a solid code base for rigging in Maya. I would recommend it to those interested in rigging and MEL scripting, though I would encourage readers with Python experience to convert the scripts as they follow along.

I am really enjoying automated breathing in the rig, it is pretty cool.

Later,
Michael Hubbard
http://michaelhubbard.ca

Thursday, July 7, 2011

ATempo Digital Archive Event

I just got back from an Annexpro event http://www.annexpro.com/news-events/. Held at the Annexpro office, the event was "Managing Digital Content Throughout Your Workflow with Backup and Archive", presented by Atempo http://www.atempo.com/ on their Atempo Digital Archive (ADA): "a long-term data storage system and file archiving solution aimed at mid-market and larger organizations".

The product looks pretty easy to use and sounds fairly easy to set up, with a nice GUI, a drag-and-drop interface, and good support for both backups and archiving. The interesting part was the integration directly into Final Cut Pro and Final Cut Server (with Avid and other products on the way), as well as drag and drop from Finder (on the Mac, although multiple operating systems are supported). It looks like Atempo partners with many of the major players in software and hardware, as well as dealing with government, and I am always interested whenever NASA is mentioned (although how involved they are is hard to say).

The focus was on digital content and optimizing SAN or NAS storage by archiving data to low-cost disk or tape, with easy retrieval. One of the nice features of ADA is its ability to make archived files appear as if they were still on disk, transparently to the user. Another interesting approach was automatic archiving, which lets rules be set up to automatically archive an entire folder (for example: if a file is over a year old and has not been accessed in a month, archive it). Quite a bit of metadata is also involved, which allows the data to be stored and found easily.
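That kind of age/access rule is easy to picture in code. A hypothetical sketch (this is my own illustration, not Atempo's actual rule engine or API):

```python
import os
import time

DAY = 86400  # seconds in a day

def should_archive(path, max_age_days=365, idle_days=30):
    """Hypothetical ADA-style rule: flag files that are over a year old
    and have not been accessed in the last month."""
    now = time.time()
    st = os.stat(path)
    older_than_a_year = (now - st.st_mtime) > max_age_days * DAY
    idle_for_a_month = (now - st.st_atime) > idle_days * DAY
    return older_than_a_year and idle_for_a_month
```

A real system would walk the folder tree on a schedule (e.g. with os.walk), hand matching files to the archiver, and leave a stub behind so the file still appears to be on disk.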

One thing that would be nice to have is an in-place versioning system. The answer today would be archiving separate copies for each version, but it would be nice to have versioning built in (similar to SVN, Git, Perforce or Alienbrain) and automatically archived. Really good (preferably open source) binary versioning with backups, minimal storage, and easy navigation and retrieval of data is something that seems to be lacking in most studios; there is no definitive solution. The storage used by something like SVN on binary data quickly becomes unmanageable, as networks and hard drives begin to struggle once hundreds of gigabytes or multiple terabytes are in play.

Still, an interesting presentation; it is always good to see what kinds of solutions are out there.

Bye for now,
Michael Hubbard
http://michaelhubbard.ca

Wednesday, June 22, 2011

Event: Siggraph "The Art of Lighting and Rendering Rio"

I went to the Siggraph event "The Art of Lighting and Rendering Rio" for the new animated movie Rio http://siggraph.ca/festivals/information.php?fest=20110621-RO which featured Jim Gettinger, Lighting Supervisor at Blue Sky Studios. Blue Sky Studios (New York) is owned by Twentieth Century Fox and has a pretty good history in 3D rendering, including the Ice Age movies, Robots, Horton Hears a Who and more.

Gettinger talked about a number of aspects that were important to lighting Rio, as well as the initial concepts for creativity and direction. The first step in a 3D movie is choosing the style. For Rio they went with roughly 80 percent realism (higher for the human characters) and 20 percent stylized, which can be seen in the stylized yet recognizable shapes in the movie. Probably the most interesting bit of news was the proprietary tool called "Studio" that Blue Sky Studios uses. I suspect it is a layer that sits on top of Maya with its own renderer (or on top of mental ray) and exposes more low-level details through tools for the artists. Some of the functionality includes:

- Voxels for feathers, plants and eventually buildings.
- Procedural textures on all background terrain and buildings (including all of the panoramic screen shots in the movie).
- Atmospheric pass that interacts with the raytracer.

There is not a lot of information about the proprietary CGI Studio software, but there is a link describing a little more at http://www.blueskystudios.com/content/process-tools.php; that was of course the most interesting part of the talk for me.

Some of the other interesting information: some frames in Rio took up to 13 hours to render. There was a lot of compositing work and shot complexity (reflections, shadows, etc.) in dealing with the human characters' glasses. Some of the work involved creating a separate pass blended with a refraction index of 1.1 (instead of 1.5 for glass) to give a more realistic look. There was also quite a lot of research and development on the skin of the human characters, especially subsurface scattering: they needed to tune the scattering for different areas of the face, creating additional maps for areas that need it, like the character's nose.

An interesting talk, in an interesting building (it looks like an art gallery on the outside, a bar on the inside, and a theater inside that :P).

Looks like I will have to check out that movie now,
Michael Hubbard
http://michaelhubbard.ca

Tuesday, May 31, 2011

"Make Your Dreams a Reality" 3vis Event in Vancouver

I attended the 3vis event "Make Your Dreams a Reality" in Vancouver http://www.3vis.com/enCA/evenements which showcased a lot of Autodesk 2012 software, including 3ds Max, Maya, Motion Builder, Mudbox and XSI. 3vis put on a nice show with some prizes, food and an open bar :) The event was broken up into three major presentations.

The first presentation was by Louis Marcoux, who talked about 3ds Max and Motion Builder. 3ds Max 2012 has a new "Nitrous" viewport with multithreaded support, SSAO (Screen Space Ambient Occlusion), soft shadows and indirect illumination, which held up pretty well on a fairly detailed model of a town. One of the cool new features is stylized rendering, which allows different effects to be applied to the render, giving the look of Photoshop filters. The Substance procedural textures were also pretty interesting; it looked like you could create a wide variety of pattern textures with minimal effort. There was also a small demo of the iray renderer, which looked pretty nice, although the fact that you choose when the render is complete makes me wonder how much extra processing time gets spent with no visible improvement when people do not know the right range for the results they want... I would have to play with it a bit to see if this is an issue. Louis also talked about how easily the Autodesk packages integrate with each other, and gave an example of moving between Max and Motion Builder, where he had set up some motion capture data with his Xbox Kinect... I will have to look closer into some of this Kinect stuff with my own Kinect soon.

The next talk was by Lee Fraser on Maya 2012 and Mudbox 2012. It focused on some of the improvements in Maya's Viewport 2.0 for more realtime rendering options (great for closely matching game rendering). Maya 2012 has improved audio options, a sequencer and "uber cam" options for editing shot sequences inside Maya. Some of the fluid options for liquids in Maya give nice, realistic results. There are also improvements to the modeling system that allow curves to be used to shape and cut out polygons from a polygon mesh. The Turtle renderer is used for advanced baking options for lightmaps, with some nice results. The Mudbox updates included support for Linux, general improvements for mapping textures onto geometry, and improvements for organizing UVs.

The final talk was by Mark Schoennagel on XSI, specifically on ICE. From a developer's perspective this was probably the neatest talk (from an artist's, probably not the most interesting), as Mark went through the process of creating different ICE nodes, both custom ones and recreations of built-in XSI features (which showed how almost any effect can be reverse engineered using ICE). Some of the most interesting things were using ICE to create an internal ray tracer within XSI, effects that change particles from one form into another, and the many ways ICE can be used to create cool fx.

This was a lot of fun, and I learned some cool things.

Bye for now,
Michael Hubbard
http://michaelhubbard.ca

Sunday, May 22, 2011

Full Indie Meetup

I went to another Full Indie meetup http://www.fullindie.com/2011/05/12/full-indie-anniversary-event-thursday-may-19th/. This one had a talk with Chris Stewart from Barking Dog Studios and Andy Moore from Radial Games. The talk was good, quite a lot of people came out, and it was a good motivator for going home and working on your own games.

I especially liked some of the things Chris had to say; it was along the lines of "fake it till you make it", but in more graphic terms. From his experience, there can be a lot of stress in being an indie game dev, and a lot of the business aspects of the job can prove more challenging than expected, even for a developer used to hard challenges. He mentioned that nothing will "kill you", and you just have to go out there and do it. The talk also covered how to deal with people. People are the most difficult part of any company, and it is important to do your best to move everyone in the same direction. It is useful to seat artists and developers together, since they are the least likely to talk to each other, and in a small group it is worthwhile for everyone to communicate a lot, since there is often no one around to manage all the important aspects.

Overall, I think that the talks were good, I did not stay too long as I had other plans, but look forward to more of these at some point.

Cheers,
Michael Hubbard
http://michaelhubbard.ca

Sunday, May 8, 2011

Book Review: Game Programming Golden Rules

Game Programming Golden Rules by Martin Brownlow offers nine chapters on various game programming topics. For this review I will go through the chapters with a brief comment on each.

1. Embracing C++: This chapter is all about using the compiler to your advantage: the preprocessor, macros and other C++ goodies that are not so much game-specific as about making sure you know the language you are working in and how best to save yourself time.

2. Orders of Complexity: Some Big O notations, Self Balancing Binary Trees, Red Black Trees, and Binary Space Partitioning (BSP) Trees.

3. Hashtables: Covers creating hashtables and hash functions, reducing hash collision frequency, file name hashes, and (probably the most interesting part of the chapter) a DNA hash for vertex shaders. Localization of text assets is also in this chapter, although I think that for a large game, an approach using key-value pairs in the same file is more maintainable than keeping the keys and values in separate files: if a file gets very long, it is difficult for the person writing it (often more proficient in language than in coding) to debug why the lines are not matching up properly.

4. Scripting: Creating your own scripting language with some sub rules: scripts should be easy to create, scripts should never crash the game, scripts should be dynamically loaded, scripts should handle multitasking, scripts should have a debugger, scripts should be extendable, scripts should never break the game build.

5. The Resource Pipeline: Covers resource pipeline tools. Similar to the Game Asset Pipeline book, it suggests an intermediate format for all assets the artists create, containing everything you can think to include (version numbers, dates, authors, etc.). The chapter also covers file compression and platform differences, and culminates in a build assistant (called DataCon by Brownlow), a simple example of what should be the case for all development teams: a one-click build system.

6. Processing Assets: Includes fonts, images, geometry and creating triangle strips.

7. Finite State Machines: Explicit vs. implicit state machines, scripting FSMs, creating combos, linking FSM objects, and indirect animation lookups that allow the same FSM to be used for each character type (with different animations).

8. Saving Game State: The difficulties of saving game state, saving and loading a game and autodetecting object changes with a saving template.

9. Optimization: Measure twice, cut once; using profilers; realizing that some optimizations do not actually improve the overall process, since all the moving parts can actually make it slower.

Overall the book was interesting, although I think a lot of the techniques could apply to any field and are not just game-specific. This book would be good for beginner to intermediate programmers, and the source code provided for the examples is pretty clear.

Happy coding,
Michael Hubbard
http://michaelhubbard.ca

Sunday, May 1, 2011

Book Review: The Game Asset Pipeline

The Game Asset Pipeline by Ben Carter is interesting in that it is probably one of the only books out there on asset pipelines for games. The book is broken into two major parts: high-level asset management, and low-level processing details. Overall, the book handles these topics effectively and with some interesting insight into pipeline work in general.

Carter suggests a number of useful tips for those working on an asset pipeline. The output of all systems should be a custom intermediate file format. This allows the intermediate files to be processed and transformed into final assets without re-exporting everything from the 3D applications (or other tools) every time the final format changes (and it will change a lot, especially near the end of a project). Keeping them as simple text files also allows for last-minute processing without that re-export overhead. Ideally the final format should be as close as possible to what will actually be in memory while the game is running, and the tools should do as much preprocessing as possible to keep the game fast at runtime.

The eternal debate of text vs. binary files also comes up. The book recommends text files (binary can be brought in later if necessary), but more important than simply using text is giving the files an explicit structure. A file with no structure, such as:

Fred
Soldier
Aggressive
1000


becomes very difficult to extend, and the parser will likely be confusing since it expects things in a certain order. A much better approach is to build the structure into the file, such as:

CharacterName = Fred
CharacterType = Soldier
AIType = Aggressive
HitPoints = 1000


With that kind of approach it is much easier to see what the structure is, and with a parser of this type, fields can be added, reordered or removed without issue (for example, if AIType does not have to be set). A still better approach than the plain text file is a structured file format like XML or JSON. While I like JSON for most things, as it closely resembles data structures, once you get into very large files it becomes trickier to navigate all the closing brackets; XML's explicit end tags are somewhat easier to navigate. Both are entirely capable of getting the job done, but it is best to look at what languages you are using and what kind of parsing support they have (C# and XML go hand in hand, as do web languages like JavaScript with JSON).
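To make the benefit concrete, here is a minimal sketch of a parser for the keyed format above (my own illustration, not from the book; the default values are invented). Because each line names its field, lines can be reordered or omitted without confusing the loader:

```python
def parse_character(text):
    """Parse 'Key = Value' lines into a dict, ignoring blank lines."""
    data = {"AIType": "Passive", "HitPoints": "100"}  # invented fallback defaults
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        key, _, value = line.partition("=")
        data[key.strip()] = value.strip()
    return data

# Fields arrive in a different order, and AIType is missing entirely.
fred = parse_character("HitPoints = 1000\nCharacterName = Fred\nCharacterType = Soldier")
print(fred["CharacterName"])  # Fred
print(fred["AIType"])         # Passive (fell back to the default)
```

Contrast this with the unstructured file, where dropping or reordering a single line silently shifts every value into the wrong slot.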

Carter continues with high-level asset management and how it differs from source control: although there are basic similarities, the size of the assets, the number of binary files, etc., make for different challenges in an asset management system. The most important aspects of an asset management system are turnaround time, metadata about the assets, versioning and handling broken data. The book looks at a number of version control systems, including SVN, Perforce, CVS and Alienbrain. Databases (like MySQL or Oracle) are usually not the best option unless they are set up to allow for huge file sizes.

A client/server architecture for the asset management system is essential; it should support locking and unlocking files, versioning, preview functionality, showing modifications, and some sort of caching system to improve performance.

The rest of the book covers more low-level details such as texture, geometry and environment processing, asset dependency analysis and final data output. It is interesting, but more platform-specific than the high-level asset management concepts in the earlier chapters. In some ways, I would have preferred more expansion on the asset pipeline aspects and less on the low-level details (perhaps splitting the two major topics into two separate books and getting everyone to buy both :P).

Overall a good book on many of the concepts of the asset pipeline for those not familiar with it, and a good resource on a topic without a lot of literature.

Bye for now,
Michael Hubbard
http://michaelhubbard.ca

Friday, April 22, 2011

Book Review: Game Engine Toolset Development

Not so much a programming book as a look into the entire world of tools development, Game Engine Toolset Development by Graham Wihlidal is an interesting read for those wanting to improve the production pipeline or tool creation. Now that my current job is focused more on being a technical artist/programmer instead of a team lead game programmer, I figure I should spend a bit more time on the technical artist and tool-building aspects (which are, for me, back to basics). The book covers a wide range of topics, focusing more on implementation details than on tool design.

The book begins largely with the politics of tools and maintenance and the different stakeholders in the tools, as well as some examples of different environments in which tools are created and maintained. There are a few examples from professional studios, which focus on elaborate GUI systems and WYSIWYG (what you see is what you get) world editors.

From there, the book has a chapter on why to create tools with C# and .NET (the language and framework used throughout the book). While I agree that .NET can be a great framework for developing solid tools quickly, I would recommend sticking with the language closest to the game engine or 3D application whenever possible, since some parts (or the entire tool) often end up merged into the game code, and it is better not to have too many languages in any production environment if you can avoid it.

There is a lot of focus on coding standards, design and documentation, which is extremely important for tools, since they are often written specifically for one project and can be very difficult to port to other projects if the design and standards are poor. Also, many tools are written once and then the programmer moves on to something else; it is easy to forget why the code was written a certain way when coming back to it in six months. Having a good design and up-to-date documentation is very important if no one is going to look at the code until the next project, or until something breaks. There is some discussion of code metrics, UI design, using NUnit for unit testing, Microsoft coding conventions and using FxCop to enforce coding policies (I also like StyleCop for coding standards). The code documentation is done with NDoc, which I am not as familiar with but will have to check out sometime (I currently like Doxygen since it works for a lot of languages).

All in all, a good book that deals with the entire job of tools development, which is somewhat unique among computer books. I realize that for some people (and companies) the need for tools is not so apparent. The best analogy I can think of is that tools are the "scaffolding" on which you build up any project. They solve the simple problems, help tackle the larger ones, glue together what needs to be connected, and improve the productivity of the entire department. Without scaffolding, everyone would be trying to add another floor to a building by hanging out a window, and if that is the case, you are not going to be able to build very high...

Keep reaching for the top,
Michael Hubbard
http://michaelhubbard.ca

Saturday, April 16, 2011

Full Indie (Game Dev Event)

In order to keep up with game development, I decided to attend the Vancouver indie game community's (Full Indie) demo night: http://www.fullindie.com/2011/04/12/april-event-game-demo-night-rocked/. It was interesting to see what Vancouver has in the way of indie developers, and it looked like there was some good stuff. I suspect a large percentage of the community was students, but it is always interesting to see what the professional indie developers are up to.

All of the games were interesting, but I especially liked Justin Smith's No Brakes Valet, a Flash game he created at a game jam, in which you attempt to park cars in different parking stalls, but occasionally one arrives with "no brakes" and crashes into things. He also had a good quote about this: "if you are good at the game, it should be fun; if you are bad at it, it should be fun" (paraphrased). This is good advice for any game designer, and was especially true in his game when the brakeless cars banged into the others and caused them to bounce around, making a really enjoyable mess. Overall a good experience, and I look forward to seeing what else is out there (who knows, maybe I will get a chance to contribute something in the future).

Cheers,
Michael Hubbard
http://michaelhubbard.ca

Sunday, April 10, 2011

Search Engine Optimization (SEO)

While I usually spend my time dealing with game engines and pretty pictures, it is always good to know a bit about what is going on in the web world too. Web technologies move very quickly, and while some are fads, new technologies worth taking advantage of come up often. While I look into these things, I enjoy my blog and website http://michaelhubbard.ca, and it is always interesting to see how to improve the look and use of your website.

The article at the Adlib Group http://www.theadlibgroup.com/headlines/create-a-website-google-will-want-to-show-off has some good tips and some interesting facts I did not realize (like the all-importance of the title tag). The idea of searching for your competition is also an important one if I ever make a business website, although updating content every day would become a full-time job :P

Until then, I will look at applying some of the tips to my own website; if you are going to do something, you might as well do it properly.

Wish me luck,
Michael Hubbard
http://michaelhubbard.ca

Friday, April 8, 2011

Game Review: GameDev Story

There is a fun little game, GameDev Story, for the iPhone/iPod http://itunes.apple.com/ca/app/game-dev-story/id396085661?mt=8 that I have been meaning to write about. The game itself is pretty addictive, and kudos to the idea and execution (if you haven't tried it, it is definitely worth a look, and some of the games you compete against, like "Street Cleaner 2", are pretty funny). That said, the thing I liked was the focus on the people: getting the right people can really make or break your success in the game industry.

With the focus on people I think a few additional attributes would be appropriate/realistic for those who have never worked in the game industry:

1. Training time for new hires. It is nearly impossible to expect people to jump into a new or established company and be as productive as someone who has been there a while. It would be interesting to see how replacing someone at the start of a project, compared with near the end, would affect the training cost.

2. Employee morale. The devs come and go and often appear to be working all hours of the day (some just leaving while others are coming in). It would be interesting to have some sort of happiness meter that decides whether or not the devs stay at the company or go work somewhere else.

3. Teamwork. How well do the devs work together? It would be interesting if those with great people skills and great coding skills were not always in the same package. What kind of results would happen if the devs were not working well together, or there were personalities that clashed? This also ties into employee morale, as even two people having problems can be a burden on the entire team.

4. Budget vs time. In the game you are not really punished for fixing the bugs, and can release earlier if needed. It would be interesting to factor in some sort of delay mechanism when a project is not on track, and ways to get it back on track.

5. More human factors (sick days, holidays, vacations, bad days, stressed days, tired days, etc.). It would also be interesting to see how the work-life balance played out for these happy little characters :P

For those looking at running a game studio, the best advice I can give is to join one yourself, even if you don't learn exactly what to do, learning what not to do is also a useful experience.

Best of luck,
Michael Hubbard
http://michaelhubbard.ca

Wednesday, April 6, 2011

Technical Artist at Bardel Entertainment

So I am just a few days into my new job as a Technical Artist/Software Engineer working for Bardel Entertainment http://bardel.ca/ in Vancouver. It is pretty exciting working with some old friends from Leap In Entertainment, and the work that goes on here is some high quality 3D animation. It is a little different than being a team lead game programmer like I was at Ganz http://www.ganz.com/ (in Toronto), but I am happy for the change. I enjoyed Toronto and made some friends there, and wish them all the best, but Vancouver is my home, and I am glad to be back.

I will likely be delving more into personal game development and hobby projects, so I am looking forward to some of that and will see about posting something interesting soon.

Back baby!
Michael Hubbard
http://michaelhubbard.ca

Saturday, April 2, 2011

Python Design Patterns

There is an interesting talk about Python design patterns at http://www.youtube.com/watch?v=4KZx8bATBFs&feature=related from Google Students. I really liked the comment that "Design Patterns must be studied in the context of the language in which they'll get implemented", as some patterns can change or disappear entirely if the language supports a better way to do something.

One of the interesting parts of the talk covers everybody's favorite/hated pattern: the singleton. Singletons are conceptually good (just one of something) but in practice they so often become just global variables, as noted here http://c2.com/cgi/wiki?GlobalVariablesAreBad . I personally prefer using dependency injection whenever appropriate, as it gives a lot more flexibility in which class the objects are actually interacting with. However, Python has an interesting approach of using a module as the singleton. This has its own issues (no subclassing modules, etc.) but does provide an interesting approach to think about (especially with the lack of private methods/members in Python).
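To sketch the module-as-singleton idea (the module and function names below are my own invention, not from the talk): Python executes a module's code only once, on first import, and caches the module object in sys.modules, so every importer shares the same state.

```python
# settings.py -- a module acting as a singleton.
# Python runs this file once on first import and caches the module,
# so every "import settings" sees the same _settings dictionary.
_settings = {}

def set_option(key, value):
    """Store a shared setting (one copy per process)."""
    _settings[key] = value

def get_option(key, default=None):
    """Read a shared setting from anywhere that imports this module."""
    return _settings.get(key, default)
```

Any file that does import settings then talks to the same state, which is exactly the singleton behavior (and exactly why it can degenerate into a global variable if overused).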

Good luck,
Michael Hubbard
http://michaelhubbard.ca

Friday, April 1, 2011

Book Review: Python Developer Handbook

A worthwhile tome of knowledge, Python Developer's Handbook by Andre Lessa is a good introduction to Python in all its forms. While it does not deal with Python 3.X, it still gives quite a lot of examples of how complete Python is as a language, and how useful it is for tools. As one of the big three languages used by Google, one of the most popular languages in academia, the scripting API for some of the big 3D tools (Maya, Blender, Fusion, etc.), and one of the most talked about languages in the world http://langpop.com/ , Python is likely going to be around for a while.

For those that don't know, Python is also named after Monty Python (the greatest sketch comedy show, and some of my favorite movies of all time), and throughout the book (as is Python tradition) the examples draw from the sketches as part of the fun. Such as:
print "I'm a lumberjack and I'm ok!"
(That is Python 2 syntax; in Python 3 it would be print("I'm a lumberjack and I'm ok!").) The book starts with basic syntax, exception handling, and object oriented programming, and then moves to more advanced topics such as extending and embedding Python, distribution, databases, and threads. There is a significant section on network programming, web development, and data manipulation with XML. The book covers both higher level modules and lower level details. Along with the network programming there is some information on GUI and graphic elements, as well as UI work with Tkinter.
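In that spirit, here is a tiny example of my own (not from the book) combining two of those early topics, classes and exception handling:

```python
class Lumberjack:
    """A small class in the Monty Python spirit of the book's examples."""

    def __init__(self, name):
        self.name = name

    def sing(self):
        return "I'm a lumberjack and I'm ok!"

try:
    jack = Lumberjack("Barry")
    print(jack.sing())
except AttributeError as err:
    # Exception handling: catch a failure instead of crashing.
    print("Something went wrong:", err)
```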

The development tool mentioned in the book is IDLE, but I would recommend Eclipse + Pydev as the way to go. Pydev is a great IDE, and the debugging options from Eclipse are really great.

I use Python for creating tools, Maya API scripts, and more, but learning more of the networking tools makes for a more complete game programmer/technical artist :)

Bye for now,
Michael Hubbard
http://michaelhubbard.ca

Sunday, March 27, 2011

Javascript Defense

Ok, so I have gotten a bit of a response from a few guys at work on my post http://gameprogrammertechnicalartist.blogspot.com/2011/03/programming-languages-to-know.html with JavaScript coming in at number two. Let me start by saying I know there is a lot of bad JavaScript code out there. Some of it is ugly, some of it is even uglier than ugly, but what is written in the language is not the same as the language itself (the same case can be made for English).

If you do not have the chance to go to school for programming and are purely self-taught, JavaScript is probably the best language to learn. You will have a lot of options to learn core concepts and lower level details as you delve into programming, and it encourages even the most casual programmers to explore. The one thing I can recommend is to go for books over website examples. Sure, there are some great examples online, but there are so many poor ones that it is often hard for a beginner to tell the good from the bad. Books are usually higher quality; just be sure to search out any errata or bug fixes that may be available in case of mistakes.

One video I would recommend to any JavaScript enthusiast is http://www.youtube.com/watch?v=hQVTIJBZook by Doug Crockford (creator of JSLint and developer of JSON), who talks about his book JavaScript: The Good Parts (which I may have to pick up at some point). One of the best quotes from the video, and good advice to any programmer or language developer, is that it is "easy to make things bigger, hard to make things better".

For those too busy to watch the whole video, you should check out http://www.jslint.com/ for any and all JavaScript, but beware, as "JSLint will hurt your feelings".

I understand that with any language, there will be those who vehemently attack or defend it. In the end, it should always come down to the right tool for the job. For certain tasks (like, I don't know, client side webpage scripting), JavaScript is a fine tool to use.

Cheers,
Michael Hubbard
http://michaelhubbard.ca

Saturday, March 26, 2011

Book Review: The Practice of Programming

A good book for those just starting out or wanting to look at programming again with a fresh set of eyes, The Practice of Programming by Brian W. Kernighan and Rob Pike has a good approach to teaching programming. Instead of focusing on syntax or abstract concepts right from the get go, the book looks at programming as a discipline, with concrete examples.

It covers things like thinking about the style of the code you write (use short names for local variables, longer more descriptive names for more public variables), being consistent in your work, using active names for functions, breaking up complex expressions, and choosing algorithms and data structures.

Probably the most interesting coverage in the book is the Markov chain program that is written in multiple languages (C, C++, and Java), along with some information on Awk and Perl. Seeing how the same functionality can be achieved in multiple languages is very informative, and the book shows how the transition and design change with the language.
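The book's versions are in C, C++, and Java; as a rough sketch of the same idea in Python (my own toy version, not the book's code), the program maps each word-prefix to the words that follow it in the training text, then walks the map to generate new text:

```python
import random

def build_chain(words, order=2):
    """Map each tuple of `order` consecutive words to the words that follow it."""
    chain = {}
    for i in range(len(words) - order):
        prefix = tuple(words[i:i + order])
        chain.setdefault(prefix, []).append(words[i + order])
    return chain

def generate(chain, length=20, seed=0):
    """Random-walk the chain: repeatedly pick a follower of the last prefix."""
    rng = random.Random(seed)  # seeded for reproducible output
    prefix = rng.choice(list(chain))
    out = list(prefix)
    for _ in range(length):
        followers = chain.get(tuple(out[-len(prefix):]))
        if not followers:
            break  # dead end: this prefix only appeared at the end of the text
        out.append(rng.choice(followers))
    return " ".join(out)

text = "the quick brown fox jumps over the lazy dog the quick brown cow".split()
print(generate(build_chain(text), length=10, seed=42))
```

The Python version is a fraction of the length of the C one precisely because the dictionary and list types do the heavy lifting, which is the book's point about design changing with the language.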

Some of the chapters on debugging are also great, and the advice and examples are useful for everyone. Really taking time to think about the code and what it is doing, even in a hectic fast-paced environment, is very good advice that sometimes gets lost in the pace of the moment.

Overall, it is a good book for beginners, easy to read and to the point (and not the size of a phone book like many computer books). I would recommend it to anyone with an interest in programming, especially those just starting out.

Cheers,
Michael Hubbard
http://michaelhubbard.ca

Thursday, March 24, 2011

Book Review: The Productive Programmer

An interesting book, The Productive Programmer by Neal Ford has some great tips for the everyday and really brings out interesting concepts about the daily tasks of programming, as well as some good advice such as "Don't Shave Yaks" (where you can delve so far into a problem you end up solving everything but the problem) and YAGNI (You Ain't Gonna Need It).

The book focuses on useful tips, but really emphasizes reflecting on the kinds of things you do. If you begin to see ways of doing something better, you will look for them all around you. For example: if you always have to navigate to the same directory over and over again, create a shortcut. If you open up the same files every morning, use an accelerator or automate that. If you have to send the same email report every week, automate that too.

Probably my favorite idea from the book is "search trumps navigation". Any time you have to find something, you waste both brain power and time looking through folders or files for the one you need. Instead, use tools to find what you need. For my Visual Studio friends, I always use ">of myFileName" in the search bar to open a file by name (this quickly became the most popular way of searching among those I introduced it to at Ganz :P). This is a lot faster than having to hunt for that class or file through thousands of files. Those on Windows really miss out on grep (Ford mentions Cygwin), but I recommend giving AstroGrep a try http://astrogrep.sourceforge.net/ . It is free open source, allows regular expressions, and displays the contents of the file (you will never go back to Windows Explorer for searching again).

Ford also focuses on aspects of eliminating distractions: turning off popup messages, using rooted views for a directory, and automating everything that usefully can be (interaction with websites, spreadsheets, rss feeds, build machines etc.). The later chapters focus on more code examples such as test driven development, source code analysis, as well as some philosophy and psychology applied to coding.

All in all, a good book, especially for the ideas it will give you about how to improve your own productivity.

Best of luck,
Michael Hubbard
http://michaelhubbard.ca

Sunday, March 20, 2011

The Programming Languages to Know

The programming languages to know (for Game Programmers and Technical Artists), but can really be applied to almost any programming task:

1. C++: The language to learn, and while I like C, object oriented programming is nearly always a better approach for large game projects. This is the language the job market talks in, the language you want to be taught in university, and the language you should be the most comfortable with. While a number of other languages are gaining interest (see C#), the low level control and the history of code written in C++ will keep this language in the running for a long time (or at least until C++0x).

2. JavaScript: While this may surprise some, this is the language for Adobe product pipeline scripting. Need something automated for Photoshop, Illustrator or After Effects? Of course you do... always automate any task that can be automated, and JavaScript is one of the only options these products have API support for. It was also great for some XSI scripting back in the day. Oh, and it doesn't hurt that it is the language of client side websites either.

3. Python: This is quickly becoming the language for scripting and glue logic. Once I would have put Perl here, which is great for tasks where you need a "Swiss Army chainsaw", but if your script needs to be maintained or is very long, it is often better to go with Python's object oriented approach. Maya including Python (along with MEL) also bumps this significantly up the list.

4. C#: Really? Sure, indie engines like Unity or XNA are worth picking up and experimenting with. While you can use JavaScript with Unity, for a very large project it is worth going with C# and the tools (Visual Studio or MonoDevelop) that go along with it.

5. Cg (shader language): If you learn this NVIDIA shader language, you will find both HLSL and GLSL a snap. If you are looking at Unity's ShaderLab or the very cool CgFX, you should definitely add this language to your toolkit of skills.

Other languages that are great to know about are web based languages like Ruby or PHP, some other more unusual languages like Prolog or Scheme (so that you can think about solving problems in different ways) and of course knowing and using a little assembly never hurts (if you really need to optimize).

You will likely look at other languages in your career, such as Java, Obj-C, Lua and more. In fact, the more languages you know and practice with, the easier it will be to transition to new ones. Most languages (especially the popular ones) have lots of good books and examples, but don't worry so much about the syntax anyway; learning the syntax and keywords is only the first step. Learning how to program effectively can be done in any language: focus on the language you work in and learn it inside out. Learn how to architect software, break down problems into solutions, and optimize and debug what you have written; what it really comes down to is being able to think in a programming language. After that, picking up the syntax differences and a few specialties in the libraries of other languages shouldn't be too hard.

If you really want a challenge, you could always try http://www.muppetlabs.com/~breadbox/bf/ or try some of the more obscure ones at http://c2.com/cgi/wiki?HelloWorldInManyProgrammingLanguages and http://www.99-bottles-of-beer.net/

But only if you are up for a challenge...
Michael Hubbard
http://michaelhubbard.ca

Thursday, March 17, 2011

RateMyEmployer.ca

In case you did not know, there is a site for rating employers: http://www.ratemyemployer.ca/ But how useful is it really for you as a potential employee? From what I can tell, most of the ratings are generally pretty negative. This has to be taken in context: a company that has a lot of employees but not a lot of ratings may be a decent company, while a smaller company without many employees but with a lot of negative ratings may not be very good. If you are lucky there will be a lot of comments giving reasons, positive or negative, for why the company was rated that way. A single comment (positive or negative) is usually not enough information to accurately assess whether it was just one happy or one angry employee posting on a good or bad day.

For those of you still in school also check out http://www.ratemyprofessors.com/

Regardless, if you are really interested, it never hurts to do a little research. If you are scoping out a new job, try to check it out before you apply. See if the people are happy working there (if you can ask them), check whether people flood out at 5:00, whether there are a lot of fancy cars in the parking lot, whether people are smiling or are walking zombies. What kind of stuff has the company produced: do they have a good track record of successful products, or are they struggling? Another good indication is how much activity (people joining and leaving) a company gets on LinkedIn. Do they have a lot of turnover, and what kind of people are they bringing in (experienced, or fresh out of school)?

It never hurts to do your homework, after all, it is your life.

Best of luck,
Michael Hubbard
http://michaelhubbard.ca

Tuesday, March 15, 2011

Book Review: Taming the Tiger

An interesting book, Taming the Tiger: The Struggle to Control Technology by Witold Rybczynski is not a programming or art book, but rather an interesting reflection on technology throughout history. The book brings up some interesting concepts: man as the prosthetic god (able to extend his reach, life, travel and abilities far beyond natural levels, all through technology) and the idea of the house as a comfortable living machine that we reside in.

The book also covers how technology pervades all aspects of our society. People who barely remember their high school physics are required to take sides in the nuclear power debate: "the fragmentation of modern society and reduction of shared experiences means people form opinions on most subjects on the basis of second hand experiences" (Rybczynski, 26).

Rybczynski delves into a number of historical backlashes against technology (Captain Swing and the Luddites, industrial revolution uprisings, Ford assembly worker overturn, etc.) and the inability to "roll back" technology once it has been introduced (only delaying the inevitable). There is also a lot on the "Shock of the Machine" and how cultures try to adapt to new technologies.

Rybczynski has some great quotes about how technology is so intertwined with our society that it is difficult to talk about (or tame). "Part of the difficulty of taming the tiger is we can't see the animal clearly. It is easy to identify the boldly striped beast in a cage, but in the splotchy light of the jungle its colors become confused with background shadows. So too with technology. It is easy to discuss in isolation, but immersed in the opacity of human culture its outlines frequently become indistinguishable from its surroundings" (Rybczynski, 213).

"Just as we have discovered that we are a part of the natural environment, and not just surrounded by it, so also we will find that we are an intimate part of the environment of technology. The auxiliary "organs" that extend our sight, our hearing, and our thinking really are an extension of our physical bodies. When we are able to accept this, we shall discover that the struggle to control technology has all along been a struggle to control ourselves" (Rybczynski, 227).

Interesting stuff,
Michael Hubbard
http://michaelhubbard.ca

Wednesday, March 9, 2011

Book Review: Death March: The Complete Software Developer's Guide to Surviving 'Mission Impossible' Projects

A great book by Edward Yourdon, Death March: The Complete Software Developer's Guide to Surviving 'Mission Impossible' Projects. Having been part of a crunch that turned into a death march project, I can say this book speaks volumes on the subject and is a worthwhile read (hopefully before you start such a project). A death march project is one where one or more project parameters exceed the norm by more than 50%: time or resources are needed that are simply not available. Yourdon continues with the disturbing news that "Death march projects are the norm, not the exception", even with the data that "the average project is likely to be 6 to 12 months behind schedule and 50 to 100 percent over budget."

How do death marches occur? The book describes a number of possible catalysts: company politics, naive promises made by marketing, senior executives or naive project managers, the naive optimism of youth (build any system over the weekend), "marine corps" mentality (no sleep necessary for "real programmers"), intense competition, startup mentality, or unexpected crises. Yourdon stresses the political aspect as the most likely candidate for a death march. The politics can border on the completely irrational: "Hysterical optimism, which is when everyone in the organization desperately wants to believe that a complex project, which has never been attempted before, but which has been realistically estimated to require three years of effort, can somehow be finished in nine months" (pg 10).

Why do people participate? High risk/high reward, the naivete or optimism of youth, the buzz of working closely together, Mt. Everest syndrome, the alternative being unemployment, or revenge. The results are sacrificed personal health, mental health and personal relationships. I also talked about some of this in my post: http://gameprogrammertechnicalartist.blogspot.com/2011/01/why-do-people-crunch.html

If management expects time and budget to be padded, they are treating the estimation of schedules as some "negotiating game", but there is also likely some degree of naivete and a lack of understanding of what really goes on. The owner of a death march is often a much higher-level manager than would be normal for the software project, usually with a long hierarchy of shareholders, stakeholders, project managers and customers in between. A long hierarchy often distorts the message on the way down to the project manager, where additional elements are tacked on, bloating the project scope and timeline with additional requirements, constraints and processes.

The major cause of death march projects is politics, often related to these negotiating games. Management always wants the enormous luxury of a promise of what something will cost and how long it will take, so they need not worry about these factors; it also gives them a scapegoat for blame if the promise is broken. Senior management needs to share the burden of uncertainty and allow for changes at all levels, not bring someone in for an on-the-spot estimate and build their entire project around something that has not been carefully worked through. Always defer instant estimates, and always give +- estimates, like 3 to 6 months, or 6 months +-50%.

The book talks about four different categories for death marches:

High morale & high chance of success = Mission Impossible (a lot of hard work, but exciting)
Low morale & high chance of success = Ugly (marine corps mentality, whips cracking)
High morale & low chance of success = Kamikaze (try to go out with one last hurrah)
Low morale & low chance of success = Suicide (no alternative)

The interesting thing is that these values are all relative to the individual programmer. Some programmers may feel they are on a Suicide project, while others feel the same project is a Mission Impossible one. People's commitment changes over time, almost always for the worse.

Work has to be mutually beneficial, and if the threat of being fired or passed over for a raise or promotion is common, then the strongest bargaining chip you have is showing you are ready to walk away from the relationship if the results aren't mutually acceptable. Two specific issues also have a significant impact on motivation and are usually under the manager's direct control: rewards and overtime. Bonuses are tricky and can backfire; at a startup, stock options may be a good motivator (tens of thousands, or millions of dollars), as can support for the whole family: taxi service, grocery service, and helping to provide medical attention for the whole family. This costs money, but usually a very small amount. Give extended vacation (they will need it), not just a few days but an extended period of time. Consider a paid sabbatical (6 months to work on whatever they want), and loaning out project equipment.

The main focus of dealing with death marches is "triage", where each task is separated into "must-do," "should-do," and "could-do" categories. After all the tasks have been separated, all work is done on the must-dos, then the should-dos, and then the could-dos (if there is time). The idea is that if the "80-20" rule holds true, the project team might be able to deliver 80 percent of the "benefit" of the system by implementing 20 percent of the requirements (if they implement the right 20 percent). As Stephen Covey puts it in First Things First, "the main thing is to make sure that the main thing is the main thing."

Yourdon offers some other project manager tips as well.

- Project leaders should put in as many hours as possible to lead by example.
- Target 60-80 hours a week, but know your limits, everyone reaches a point of diminishing returns.
- Need to be truthful with the team; it is disillusioning to find out crucial information later.
- Make sure people are compatible.
- Show work in increments, but avoid prototypes which only perpetuate an illusion of completed work.
- Modifications to baseline requirements should be made public for all to see.
- Don't introduce new processes to a death march project.

At some point the death march is likely to reach an "ugly crisis", where it becomes clear that the project will not be completed on time, and the project manager is fired and replaced with a new project manager who is supposed to "get it out on time". The work-in-progress created by the project team before the "ugly crisis" usually ends with the sad result of being thrown away: "The real reason why all of this partially-completed work ends up being wasted inventory is that no one will ever have time to come back to it. Assuming that the project team members (now under the control of a new manager, whom they may or may not respect) is able to deliver the "bare minimum" of critical functionality, they're usually so exhausted that half of them quit. And the users are so disgusted with the project that they never bother asking for the rest of the unfinished functionality; or conversely, they're so satisfied with the minimal functionality that they never bother asking for the rest of the system".

Even though everyone understands the issues intellectually, the political battles surrounding death march projects make it almost impossible to reach a consensus on a reasonable triage. It is only when the "ugly crisis" occurs that the various parties finally agree on something they should have agreed upon when the project began.

Try to focus on what works, not on process, methodology, or bureaucracy. That is not to say you should have no process, since a good one can be a significant help, but be reflective about what works and what doesn't. Minimize the number of tools needed; like a mountain climber, just bring the essentials.

Finally, Yourdon recommends quitting or leaving if it is a viable option; learning something new for less pay will make you a happier person. And if any of these death marches succeed, what is to say they won't try it again? Some companies need to fail to see what they are capable of; hopefully it doesn't ruin the company. It is important not to measure the success of a project by the number of divorces, broken relationships and ulcers.

I have been part of a few death marches, and while a few can be bearable for a little while, in the long run you will almost always regret it. Avoid these types of projects if at all possible, everyone needs balance in their lives. If you find yourself on this type of project, the most important thing to remember is this: Life > Job

Best of luck,
Michael Hubbard
http://michaelhubbard.ca

P.S. Or if you prefer Life > Work (Life is greater than Work, for my non-programming friends), I should make it a t-shirt :P