Pixar graphics papers

A great source of computer graphics papers can be found here, from the experts at Pixar, the folks behind great movies such as Finding Nemo, Cars, and the recent WALL-E. For those that don’t know, Pixar is also responsible for creating the 3D rendering software PhotoRealistic RenderMan (PRMan), which is used to create their movies as well as movies and commercials by other production studios. They are also responsible for the RenderMan Interface Specification, the file format that can be used to define the scenes.

They have released a number of papers going all the way back to the 1980s covering some of the fundamentals of computer graphics, including the often sought-after paper on the REYES architecture, which is the primary rendering algorithm used by PRMan.

REYES (Renders Everything You Ever Saw) is an algorithm for rendering 3D graphics that can be distributed among many processors without the need to pass the whole model around (unlike ray tracing), and it produces images quickly. Even today it is somewhat of a rock star among rendering algorithms, with many people still asking how it works and writing implementations of it. For some odd reason, most graphics books tend not to even cover this technique, even though it is one of the most admired and imitated algorithms. Part of its appeal is the simplicity of the algorithm and the fact that it lends itself easily to being extended with texture and even displacement shaders, as well as effects such as motion blur and depth of field.
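The bound/split/dice/shade/sample structure of REYES can be sketched in a few lines. This is strictly a toy illustration, not PRMan's implementation: the only "primitive" is a screen-aligned quad, the "shader" is a gradient, and all class and method names are invented.

```java
// Toy sketch of the REYES loop: bound, split, dice, shade, sample.
import java.util.ArrayDeque;
import java.util.Deque;

public class ReyesSketch {
    // A screen-space axis-aligned quad standing in for a surface patch.
    static class Patch {
        final double x0, y0, x1, y1;
        Patch(double x0, double y0, double x1, double y1) {
            this.x0 = x0; this.y0 = y0; this.x1 = x1; this.y1 = y1;
        }
        double size() { return Math.max(x1 - x0, y1 - y0); }
    }

    // Stand-in surface shader: a simple diagonal gradient.
    static double shade(int x, int y) { return (x + y) / 14.0; }

    public static void main(String[] args) {
        final int width = 8, height = 8;
        double[][] image = new double[height][width];
        Deque<Patch> work = new ArrayDeque<>();
        work.push(new Patch(0, 0, width, height)); // the whole "model"

        while (!work.isEmpty()) {
            Patch p = work.pop();
            // Bound: cull patches entirely off screen.
            if (p.x1 <= 0 || p.y1 <= 0 || p.x0 >= width || p.y0 >= height) continue;
            if (p.size() > 1.0) {
                // Split: quarter the patch until pieces are pixel sized.
                double mx = (p.x0 + p.x1) / 2, my = (p.y0 + p.y1) / 2;
                work.push(new Patch(p.x0, p.y0, mx, my));
                work.push(new Patch(mx, p.y0, p.x1, my));
                work.push(new Patch(p.x0, my, mx, p.y1));
                work.push(new Patch(mx, my, p.x1, p.y1));
            } else {
                // Dice, shade, sample: a pixel-sized patch is a micropolygon;
                // shade it and write it into the pixel it covers.
                int px = (int) p.x0, py = (int) p.y0;
                image[py][px] = shade(px, py);
            }
        }
        System.out.println(image[0][0] + " " + image[7][7]); // prints 0.0 1.0
    }
}
```

The distribution-friendliness falls out of this structure: each patch on the work queue is processed independently of every other patch, so patches can be farmed out to processors without sharing the whole model.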

Some of the more recent papers cover the finer points of computer graphics built on the fundamentals such as soft reflections, hair rendering, deep shadow maps and distributed ray tracing. Many of the papers describe some of the technical aspects of achieving certain effects in their movies.

As a ray tracing enthusiast, I found a great paper on Ray Differentials and Multiresolution Geometry Caching for Distribution Ray Tracing in Complex Scenes. One interesting aspect of this paper is the possibility of introducing displacement maps and shaders into the ray tracing pipeline. Displacement maps and shaders modify the actual geometry of a surface, as opposed to simply changing its appearance to make it look displaced.
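The distinction can be shown with a toy sketch: a displacement shader moves the vertices themselves along the surface normal, so the silhouette and shadows change, not just the shading. The flat strip, the +Y normal, and the height function below are all invented for illustration.

```java
// Toy displacement: push each vertex of a flat strip along its normal.
public class DisplacementSketch {
    // Height field sampled by the displacement shader (a bump peaking at x = 2).
    static double height(double x) { return Math.max(0, 1 - Math.abs(x - 2)); }

    public static void main(String[] args) {
        double[][] verts = new double[5][3];      // flat strip along the X axis
        for (int i = 0; i < verts.length; i++) verts[i][0] = i;
        double[] normal = {0, 1, 0};              // flat surface: normal is +Y

        // Displacement: move each vertex along the normal by h(x), actually
        // changing the geometry rather than only its shaded appearance.
        for (double[] v : verts) {
            double h = height(v[0]);
            v[0] += normal[0] * h;
            v[1] += normal[1] * h;
            v[2] += normal[2] * h;
        }
        System.out.println(verts[2][1]); // prints 1.0: the geometry moved
    }
}
```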

This is a great set of papers for those that are graphically inclined, from some of the best minds in the business.

The Economics of Pizza

Like most people I enjoy pizza, especially when I order in rather than making my own, and like most people I consult the stack of coupons that arrive in the mail daily. Personally, I live in an area where there are no really good pizza places to capture my loyalty, so price becomes the deciding factor.

One trend I have noticed recently is the tendency to offer deals which are either buy one get one free, a value meal with a pizza, wings, salad and a 2-liter for $20 or so, or a pizza with one topping for a low price. Here’s the problem with that.

It’s great for people with large families, or when your kids are having a party, or you invite friends over to watch a game or something. However, it does leave a certain demographic without options. My wife is not a big fan of pizza, so usually I’m the only one eating it (she will steal a slice or two later in the evening, though). I usually prefer to have some left over (lunch the next day), and I like a number of toppings on my pizza. If pizza shops are offering to feed a family of 4 for 20 bucks, surely I can get a decent-sized pizza with multiple toppings for about $10?

See, if I buy a 12-inch with two toppings for $10, and they have a buy one get one free deal, I end up with 2 crusts, 4 toppings and 2 helpings of cheese. How about you just keep the other crust and throw the extra cheese and toppings on the one pizza? I end up with 4 toppings and extra cheese for $10, and you’ve saved the hassle of making, cooking and boxing a second pizza. Incidentally, Gina’s Pizza in Strongsville does exactly this: I can get a large (16-inch) with 3 toppings for $10 with a coupon. They are also very good.

Also, when you offer a cheap and cheerful pizza at a discount, at least make it as good as your standard pizzas. Some pizza shops here have pizzas already cooked and waiting for walk-in customers to stop by and pick one up. The trouble is that there is very little cheese and pepperoni on them, and there are about 2 inches between the start of the cheese and the edge of the crust. OK, it is a cheap pizza, but at the same time, it’s not great advertising for how well you make pizzas. If you are going to run a loss leader, at least make it something that is going to reflect and advertise your business effectively. Businesses often offer inferior products at a discount, but when doing so, they brand them under a different name, not as a ‘lite’ version. With pizza, that’s a little more difficult to do, but the pizza shops do try to brand them as discount pizzas to distinguish them. However, that distinction is not always clear.

How can this translate into something more interesting than pizza? Well, it’s worth remembering this when you are designing marketing and sales strategies. If you are going to offer discounts, make sure it is something that can apply to smaller businesses as well as bigger ones. If you offer 3-for-2 on training courses, that’s not going to do the smaller company, with only 2 developers, much good. You are giving them a free training course that they are going to throw away, or they will pay to send someone who perhaps doesn’t need the training, just because they may as well use the free space.

If you are going to offer a home or personal version of software (as opposed to a professional one), make sure it still does everything the user needs to a reasonable degree. You don’t need to include every bell and whistle, but make sure it is usable and doesn’t reflect badly on you and thereby dissuade users from upgrading to the professional version. Also make sure the two versions are clearly marked and the user is aware that there is a professional version that comes with additional functions. Otherwise they might just think that this is the whole product, and that the one offered by your competitor for a little more money has more features.

European experiment could destroy planet

Which planet? Earth, that is. In Europe there is an ongoing court bid to halt the switching on of CERN’s Large Hadron Collider. The collider is the world’s largest particle accelerator, running underground at the Franco-Swiss border near Geneva with a circumference of 17 miles. The concern is that when used, the collider will generate black holes which could grow exponentially and swallow the Earth.

Of course, this isn’t the first time the planet has faced such danger. When I was 10, I tried to build my own particle accelerator, which failed more miserably than the laser I tried to build from a cardboard tube, aluminum/tin foil, a 2-liter coke bottle, some vinegar and baking soda. It was going to be a carbon dioxide laser, since those were used in manufacturing to cut through sheet metal, and at 10 years old, the first thing you want to do with the laser you just built is cut stuff up (also, at the time I couldn’t afford gems to excite the particles). I spent a lot of time as a kid reading my older brother’s physics books, which was great for my education but not so much for my bedroom.

Regarding the LHC, a safety report from 2003 examined the risks of micro black holes, among other phenomena, and concluded that there was no basis for any danger. This report was reaffirmed in 2008 by the LHC’s Safety Assessment Group.

CERN will be flicking the switch on September 10th, 2008. I wish them more luck than I had with my particle accelerator, or as I affectionately called her, “Ol’ Lightning”.

Updated 9/13/2008: We’re all still alive!

Is Spring between the devil and the EJB?

Reading this post on Javalobby prompted me to go and dust off a post I wrote a while ago but hadn’t published regarding Spring and the revitalized EJB standard. At the time I was fired up by this post by Rod Johnson, which seemed to be a large helping of FUD and insults. Nonsense such as suggesting that because some people were using app servers and some weren’t, the age of the app server was over, like suggesting that because I want a shovel to dig a hole, we no longer need backhoes. This was interspersed with some irrelevant quotes from Gartner made to look like evidence, and malicious comments about EJBs and their users. It seemed like the Spring folks were champing at the bit to pronounce EJB dead when in fact, as evidenced by some recent posts, it is very much alive. In hindsight, it seems the Spring guys were trying to lay some marketing groundwork prior to releasing their own OSGi application server.

This brings me to this latest post, one of a number of recent posts which sing the praises of EJBs and in this case ask the Spring developer “why not?”. It’s almost like the question nobody asks because the presumption is that the answer is obvious. It also touches on the issue of Spring and EJB developers not getting along, which I think was fueled in part by the old arguments between Rod and Gavin, who seem ‘passionate’ about their technology choices. However, there is still some animosity between the two camps years after those minor flame wars. I think part of it stems from users being defensive, and therefore offensive, protective of their technology of choice even with the flaws they are aware of, which is a normal response.

Disclaimer: I’m currently working on a Seam project and have been involved on the Seam forums. However, when I need a quick dependency injection library (especially for SE), I turn to Spring.

EJB users are having to defend a technology which has the appearance of being stodgy and has a terrible legacy, even though its modern-day incarnation is far more hip, cool and even Spring-like. Few negative comments about EJBs appear to be about flaws in the current implementation, other than the fact that EJBs require a container.

The Spring users have to defend a technology that is in essence proprietary as opposed to standards-based, and while the core Spring functionality (DI, AOP) is very good, a number of people believe that it is starting to spread itself a little too thin and suffer from the dreaded ‘bloat’. It is now facing competition from EJB, a technology that is not only as easy to use and as powerful, but is also a standard, which, all things being equal, is a positive. If nothing else, being a standard will also give it a helping hand in being adopted in some of the more corporate shops.

While SpringSource haven’t implemented the standards, I’m sure there will be a Spring-driven implementation of Web Beans (JSR 299) which could drag Spring kicking and screaming under the standards umbrella. If Web Beans gains traction and becomes the accepted way of defining components for web applications, then there is a chance people will choose the standards-based Web Beans syntax over a proprietary Spring syntax and be able to swap out implementations. One advantage Spring does have is the ability to provide its core functionality in both desktop and web applications, which unfortunately isn’t a part of the Web Beans spec (yet?). This may provide enough reason for developers to avoid using Web Beans, or at least limit it to pieces that will definitely be web-based only.

I do like the fact that the only opaque part of Spring is the container. Other pieces like the transaction manager, data sources and so on are all there for you to see in your configuration, unlike the EJB container where they are just bundled in and magically mess with your beans. That lack of transparency can also be a turn-off for some people, who prefer a simpler Spring solution over complex old EJBs.
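That transparency looks something like this in era-typical Spring XML. The bean classes named here are standard Spring and Commons DBCP ones, but the property values are placeholders, so treat it as a sketch rather than a working configuration.

```xml
<!-- The data source and transaction manager are plain, visible bean
     definitions rather than services hidden inside a container. -->
<bean id="dataSource" class="org.apache.commons.dbcp.BasicDataSource">
  <property name="driverClassName" value="org.h2.Driver"/>
  <property name="url" value="jdbc:h2:mem:app"/>
</bean>

<bean id="transactionManager"
      class="org.springframework.jdbc.datasource.DataSourceTransactionManager">
  <property name="dataSource" ref="dataSource"/>
</bean>
```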

In some ways Spring feels like that small cafe that worked really well, was cheap and served great food compared to ‘those chains’. The owner decides to open a couple more restaurants, and since he can’t run all of them, he hires extra help and trains them, but they don’t always get it right and lack the enthusiasm of personal service. He opens a couple more stores and decides to produce a manual detailing every aspect of the recipes and customer service. Before he knows it, he is one of ‘those chains’: the quality of food has gone down and the prices have gone up. Not that I think every Spring project is prone to fail unless it is under the guiding hand of Rod. However, Spring has spread beyond its core functionality while expecting the same level of buy-in from developers, and from Rod’s post referenced at the beginning of this post, it seems they are even trying to manufacture buy-in.

At one time, the Spring team would have criticized the inability to move a ‘standard’ EJB from one app server to another; now they just expect you to deploy applications in their proprietary modules for their app server. They would have criticized the bloat of the implemented standards, and now if you want to use their Web Flow API, you have to include their Web MVC framework even if you are using JSF. I think this is a bit of a reach from the SpringSource folks. Just because I put my dependency injection egg in your basket, it doesn’t automatically mean I’m going to put my view technology and server choice eggs in there too.

Java Posse Hits 200th Podcast

The chaps over at the Java Posse recently celebrated their 200th podcast with a retrospective of their first 200 episodes since they started back in September 2005. In the last 3 years they have provided some great round-ups of Java news and interviews, as well as adding their own (usually!) informed perspectives. They are a somewhat diverse group of developers considering they are all Java developers, often including news that is relevant to, or on the cusp of, Java development. They often give their opinions on the state of the industry, alternative technologies and related technologies. Having been away recently, I’m a few episodes behind, so I have some catching up to do.

Great stuff chaps and here’s to another 200 episodes.

I’m not dead yet

OK, I know in my original post I tempted fate by suggesting that my Hello World post might end up sitting in solitary on my blog, forever archived on Google or down the memory hole. However, I have been fairly busy.

I’ve been working on creating my own WordPress theme, which was interesting. It’s always fun taking a blank structured HTML page and using CSS to organize it and bring it to life, hopefully making it look good. This page gives you a great introduction to WordPress themes, starting from the very basics and building up from there. Some items needed tweaking to work OK, but once I got the basics down, I started fresh (like the article does) with my own CSS layout and built my pages from there. It still needs some work and some visual tweaks, but overall I like it. Of course, next month I might just decide I hate it.
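As an aside for anyone tempted to try the same, the one convention worth knowing up front is the comment header WordPress reads from the theme's style.css to recognize and list a theme; the field values below are placeholders.

```css
/*
Theme Name: My Handmade Theme
Description: A theme built up from a blank structured HTML page
Author: Your Name
Version: 0.1
*/
/* WordPress reads the comment header above to identify the theme;
   ordinary layout rules follow it in the same file. */
body { margin: 0; font-family: Georgia, serif; }
```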

I’ve also been working on a rather long article comparing Spring Web Flow and JBoss Seam. I took the easy option and started writing it using MS Word, thinking I would just port it to HTML or print to PDF. However, after a while it got rather large, and I started wondering how easy it would be to port to HTML, and whether I wanted to limit it to PDF. The answer I found was in using DocBook to produce the document. DocBook is an XML-based markup language for defining technical documentation. This page offers a great tutorial on setting up an Eclipse project in which to edit and build your DocBook source into HTML and PDF targets.
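For a flavor of the markup, here is a minimal DocBook article skeleton of the sort such tutorials start from; the element names are standard DocBook, while the titles and text are placeholders.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<article>
  <title>Comparing Spring Web Flow and JBoss Seam</title>
  <section>
    <title>Introduction</title>
    <para>One DocBook source can be styled into both HTML and PDF targets.</para>
  </section>
</article>
```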

Regarding the article comparing JBoss Seam and Spring Web Flow, I am still working on it. It is complete except for the conclusion, although I am still proofreading it, and have been trying to do this over the past month or so. It attempts to draw some comparisons that are very relevant to the typical CRUD code that developers write every day, and avoids the simplistic Hello World projects that are often used to demonstrate and compare technologies. The articles compare the frameworks in the context of a somewhat real-world project involving CRUD and master-detail relationships. I’m hoping to have this completed in the next few weeks.

Is Microsoft digging in the wrong place?

Java developers might feel a sense of unease at the number of posts predicting its demise at the hands of .net. However, developers have evolved; at least I know I have. Looking back on my Delphi days, we used to couple visual components to datasets that were in turn coupled to the database by a SQL statement. Inserts, updates and deletes were made using cached updates with SQL statements generated from the select SQL. For the most part, unless you were really aware of the issue, transactions were ignored, and despite Delphi being an object-oriented language, this style made rather shallow use of the methodology.

Handling validation, corrections and formatting of fields was always a tricky affair and involved data validation code scattered throughout the application. In most cases the code was stuffed into event handlers and reproduced each time it was needed.

Over time, maintenance became a nightmare. If a field name, type or size changed it would break the dataset that was bound to that table or field. It meant having to go and manually refresh the fields on each dataset, and some changes meant having to manually modify each SQL statement that involved a particular table. For the last few years, most of my Delphi gigs have involved maintenance of Delphi applications and each time I was presented with spaghetti code where logic was strewn throughout the application in the unlikeliest of places.

One problem with Delphi is the ease with which applications can be created using a point-and-click interface, which sounds like a good thing, except that it leads to casual and lazy programming. (Un)Fortunately Delphi was never so popular that the masses picked it up, otherwise we could be swimming in a sea of bad code instead of a small lake. I like to think that those who did adopt Delphi as their language of choice took enough pride in it to do a decent job, although obviously not everyone did.

Having seen and worked with Java, it is like coming out from under a rock and programming like an adult. We have frameworks that use good OO principles as often as they use ‘if’ statements. We can use rich ORM models to decouple the application from the database, and we use other layers and dependency injection to abstract everything from casual data access to complex processing, so we can easily refactor at will and switch out one concrete class for another. Our domain models decouple our view and our logic from our database. The web frameworks have become more plentiful and feature-rich, and while that gives the developer a dizzying choice, it shows that Java folks are being creative and innovative.
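The decoupling described here can be sketched without any framework at all: the service depends on an interface, and a concrete class is injected through the constructor so it can be swapped out at will. All names below are invented for illustration.

```java
// Framework-free dependency injection: program to an interface, inject
// the implementation through the constructor.
import java.util.List;

public class DiSketch {
    interface CustomerRepository {
        List<String> findAll();
    }

    // One concrete choice; a JDBC- or ORM-backed class could replace it
    // without touching CustomerService.
    static class InMemoryCustomerRepository implements CustomerRepository {
        public List<String> findAll() { return List.of("Ada", "Grace"); }
    }

    static class CustomerService {
        private final CustomerRepository repository;
        CustomerService(CustomerRepository repository) { this.repository = repository; }
        int customerCount() { return repository.findAll().size(); }
    }

    public static void main(String[] args) {
        // Wiring done by hand here; a DI container would do this from config.
        CustomerService service = new CustomerService(new InMemoryCustomerRepository());
        System.out.println(service.customerCount()); // prints 2
    }
}
```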

Despite having the choice of Java on one hand and Delphi on the other, Microsoft chose the Delphi style of development. They chose tight coupling between data, logic and view tiers, with developer tools that encourage point-and-click coding and discourage re-use. They chose the route that promotes lazy development and takes the user down the path of spaghetti code. I think Microsoft could have fully embraced true OO values and built them into Visual Studio.

Microsoft has taken steps to remedy this with the introduction of the ObjectDataSource in .net 2.0, which allows developers to couple their controls to lists of objects. They have also released their own action-based MVC framework, which puts them only about 10 years behind Java, which in software development terms is a lifetime.

There are also efforts to port the best Java libraries over to .net, such as Spring.NET and NHibernate, that may encourage people to adopt better practices. I have used NHibernate with ASP.net and found it works very well, but I worry that the lack of a ‘Made by Microsoft’ stamp will keep it on the sidelines in most corporate shops. There are also many developers who will move to .net from VB and Delphi and continue their bad practices without thought, forging new shackles for themselves in a new language.

ASP.net pages are already being shown the door in favor of Microsoft MVC, and Winforms has come and gone in favor of WPF. If developers had separated their layers, they could put a new view layer over the business/data layer rather than have to redo many pieces with a new front end. I doubt very much that this will be the last major upheaval to .net in the next decade. It seems like few pieces of Microsoft technology have a practical lifespan of more than a few years. Most articles promoting good practices in .net involve shallow OO principles and, like Delphi’s, are relegated to the strange and esoteric. Why write code when I can point and click?

Looking at Delphi’s past and present, I see a future for .net with an abundance of legacy apps that require someone to jump in and untangle all that spaghetti code. It may pay the bills, and in some cases pay them well, but that doesn’t always make for relief from the misery (as a Delphi developer, I would know).

This Delphi programmer has evolved, but it seems that Microsoft has not yet made the leap, and neither have a number of their developers who keep digging in the wrong place.

OK Sallah, go bust a move, but watch out for the dead monkey.

Update: An interesting discussion on this topic has developed in this post on Fredrik Normén’s blog.

Hello World

To paraphrase Christopher Hitchens, the truth usually falls between two opposing views, and the only way to influence the determination of that point is by getting involved, having your say, refuting a few arguments and maybe changing a few opinions. Now that I’ve finally found the time to create my blog, maybe I can do just that. Here’s hoping it doesn’t become just one of many derelict first-post blogs with a handful of temporally divided posts.

It is somewhat poignant that today is the date of my first post, given my history. I started writing software professionally 13 years ago, in the summer of 1995, at a company that wrote tax software. My boss came in, threw me a brand spanking new, hot-off-the-presses, only-just-released copy of Borland Delphi version 1, and told me to learn it. It was Windows 3.1 and had its problems, but it was awesome, and I soon became a Delphi guru.

That event has shaped the 13 years I have since spent working as a Delphi developer. It’s been a rough road; in the late 1990s I worked as a consultant, waiting on a daily basis for the Delphi jobs to trickle through my email box. Most gigs have been unattractive maintenance jobs, usually providing support until the client can rewrite it in something else. I’ve been through the Inprise name change, I’ve been through the lawsuits and employee poaching between Microsoft and Borland, and I’ve been through the Corel fiasco, the Kylix fiasco, the CodeGear fiasco, and felt every other trip and fall that Borland, and especially Delphi, has had in the last 13 years. I paid good money out of my own pocket for Borland Delphi 2006, and it was a stinker. I’ve been in the same job for the last 8 years, since July 2000, working on a few Delphi projects, and it’s time for a change.

I aim to blog about a number of these changes, the artifacts and events that have precipitated them, and the path I’m heading in, but for now, let’s just say that it’s time to push in new directions. This blog is really a symbol of that change, because I intend for much of it to focus on my new directions, and on this same day CodeGear has been sold to Embarcadero for $23 million. I wish them all the best, but Delphi has been the unloved stepchild of Borland for a number of years, and I don’t see any small amount of investment fixing that, even if they choose to make the effort. In a day and age where IDEs are free and platforms are competing more than ever, I don’t see a small shareholder of that market staying in the race for long.

It seems fitting and ironic that on the day I embark on new ventures, news comes out that will probably end with Delphi being put out to pasture.

For me, Delphi is another Betamax, a far superior technology that just couldn’t get a good enough footing to reach critical mass. It spoilt me as a developer; no other platform or tool had the richness, or allowed for such development speed, with strong third-party and community support. It wasn’t all pretty, and some of it I’ll write about, but pound for pound, Delphi was the tool, the language and the framework to which all others aspired.

I’ve been looking for new directions to take, and coming from Delphi, it is difficult to find tools and a language that stand up to the standards I’m used to. However, I have found a rather solid contender in the form of JBoss Seam, which comes with an IDE that, rather than being a mish-mash of malfunctioning plugins, is actually geared towards developing Seam applications. Seam also provides much of the richness of developing a thick client application, and getting a project started isn’t a project in itself. For now, Seam appears to put the RAD into web development.

There is always .net development, which coming from a Delphi background should feel like home. I’m working with Java at the moment, but I am sure I’ll be getting back into Microsoft territory.

So here’s to new ventures and old friends.