Monday, July 02, 2012

Sometimes it's the little things in programming that surprise the most

I need to brush up on Core Data. I've got bugs to solve and I haven't used Core Data for anything real yet, so best to school myself.

I've got iOS Programming: The Big Nerd Ranch Guide, 2nd Edition, with a chapter on Core Data, and Core Data: Apple's API for Persisting Data on Mac OS X to choose from to get started. I chose iOS Programming because it's just a chapter; a single chapter should keep me focused even if I can't get all the way through the dedicated Core Data book…

I'm reading through, getting reacquainted with the terminology, when I see something that strikes me as odd. The sample is talking about how entities in Core Data don't automatically save their order relative to other entities, and how you have to create an attribute yourself to manage this. Authors Joe Conway and Aaron Hillegass call it orderingValue. No big deal, I've done this countless times. Then the authors do something completely unexpected. Instead of using an integer for the attribute, they use a double!
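Here's roughly what the integer approach I'd always used looks like, as a quick Swift sketch of my own (the names Item and orderValue are mine, and the book's code is Objective-C, not Swift):

```swift
// A sketch (mine, not the book's) of the integer approach: moving one
// item forces every row behind it to be rewritten.
struct Item {
    var name: String
    var orderValue: Int
}

func move(_ items: inout [Item], from source: Int, to destination: Int) {
    let moved = items.remove(at: source)
    items.insert(moved, at: destination)
    // The expensive part: every item's orderValue must be reassigned
    // to match its new position in the list.
    for index in items.indices {
        items[index].orderValue = index
    }
}
```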

I literally think "huh, that seems strange, I've always used an integer..." and keep reading. A few paragraphs later, they explain why they chose a double. If an entity's position changes, with an integer you have to change all the other entities' orderingValues. With a double, you just find the orderingValues of the entities in front of and behind the entity that got moved, add them together, and divide by two. Thus, the new orderingValue falls directly in between (there's a sketch of this at the end of the post)... As soon as I read that, my mind did something very similar to this moment in Pixar's Ratatouille when critic Anton Ego has a childhood flashback after trying the titular food (starts at :15 into the clip).

Of course, instead of food, I flashed back to all the code I ever wrote using an integer to solve that problem. Correction: all the wrong code I wrote to solve that problem.
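For the record, here's what the doubles trick boils down to, again as a Swift sketch of my own rather than anything from the book:

```swift
// A sketch (mine) of the doubles trick: only the moved item changes.
// `before` and `after` are the orderingValues of the neighbors at the
// destination; nil means the item landed at an end of the list, where
// I'm assuming a fixed step of 1.0 past the boundary.
func newOrderingValue(before: Double?, after: Double?) -> Double {
    switch (before, after) {
    case let (b?, a?): return (b + a) / 2.0  // halfway between neighbors
    case let (b?, nil): return b + 1.0       // moved to the back
    case let (nil, a?): return a - 1.0       // moved to the front
    case (nil, nil): return 1.0              // only item in the list
    }
}
```

Only the moved item's orderingValue gets written; its neighbors never change.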