Nerding It Up, Part Two: The Results

So before I get to the actual results of who won, I wanted to go over some of the numbers, in part because I’m a giant dork who loves spreadsheets, and in part because, since this is the first one of these things we’ve ever done, I wanted to see how the stats affect the game. In fantasy sports you’re drafting based on future performance, whereas with the Fantasy Film Draft we were drafting based on results that were already available. Heading into the day, it was pretty clear that both Adam and I had done significant research on the numbers ahead of time, with Fish and a couple other guys doing some, and the rest of the guys essentially winging it.

For the draft to be a viable thing going forward, it needs to be something more than just everybody grabbing the stats ahead of time and taking the next movie on the master list when it was their turn to draft. That would be the opposite of fun. There needs to be *some* sort of variable for it to be worthwhile to do again. So what follows is a bunch of stuff I thought was either interesting, important, or just kinda fun. Do with the info what you will. I’m sure we’ll have some form of nerd conference before we take a stab at another draft and some of this junk will be useful.

First up, let’s just look at some results that are specific to this draft only and wouldn’t necessarily carry forward to the next one, given the specificity of the draft targets.

Total Score By Director
1. Steven Spielberg (1244)
2. Martin Scorsese (1865)
3. Francis Ford Coppola (2242)
4. Quentin Tarantino (2466)
5. David Fincher (2589)
6. Ron Howard (2666)
7. Joel Coen (2781)
8. Rob Reiner (2876)
9. Tim Burton (2977)
10. Oliver Stone (2996)
11. Wes Anderson (3093)
12. Spike Lee (3352)
13. Michael Bay (3390)
14. David Zucker (3478)
15. Kevin Smith (3994)

Total Score By Wildcard Decade
1. 1970’s (1057)
2. 2000’s (1198)
3. 1990’s (1283)
4. 1980’s (1714)
5. 2010’s (1853)

How good is Spielberg? He’d rank 3rd overall when compared against the decades, the only director to crack their ranks.

Top 10 Overall Films
(draft position in parentheses)
1. Lord Of The Rings: The Return Of The King (14th – Craig)
2. Star Wars: Episode IV – A New Hope (5th – Andy)
3. The Godfather (4th – Adam)
4. The Lord Of The Rings: The Fellowship Of The Ring (48th – Gordon)
5. Raiders Of The Lost Ark (52nd – Adam)
6. Forrest Gump (12th – Andy)
7. Saving Private Ryan (64th – Gordon)
8. Schindler’s List (10th – Bill)
9. E.T. The Extra-Terrestrial (9th – Fish)
10. The Dark Knight (20th – Adam)

Again Spielberg is the big winner, as he places 4 films in the top 10, while Coppola netted 1 and the other 5 all came from the decades bin. You can already see that having the numbers in advance paid big dividends for Adam and me, as we grabbed the films with the biggest value among the 10. I’ll explore that concept further in a bit.

Bottom 10 Overall Films
151. Flipped (142nd – Craig)
152. Big Eyes (122nd – Bill)
153. Bottle Rocket (155th – Mark)
154. Jay And Silent Bob Strike Back (156th – Andy)
155. Scary Mary 4 (146th – Jud & Mike)
156. Wall Street: Money Never Sleeps (83rd – Craig)
157. Zack And Miri Make A Porno (157th – Adam)
158. Mallrats (152nd – Fish)
159. BASEketball (154th – Bill)
160. Tusk (151st – Bill)

Not really any surprises here, and not much you can do about it. About the only thing I take from these numbers is that you were better off getting the best films from the weaker directors, since their crap really dragged you down.

Now let’s take a closer look at the picks that gave the best real value. Here I’m comparing the position where the film was drafted against where it actually ranked in the final tally. A negative number here is good (rank – draft position = value).
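Since I’m already nerding it up, here’s the value calculation as a quick sketch (the numbers in the example are hypothetical stand-ins, not actual draft data):

```python
def pick_value(final_rank, draft_pos):
    """Value of a pick: final rank minus draft position.
    Negative is good -- the film finished higher than where it was taken."""
    return final_rank - draft_pos

# Hypothetical example: a film drafted 150th overall that finished 7th
# would score a value of 7 - 150 = -143.
print(pick_value(7, 150))  # -143
```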

Top 10 Value Picks
1. Gravity (-143 – Gordon)
2. Patton (-130 – Gordon)
3. American Beauty (-121 – Gordon)
4. Finding Nemo (-117 – Andy)
5T. Slumdog Millionaire (-108 – David)
5T. Back To The Future (-108 – Gordon)
7. Tootsie (-90 – Mark)
8. The Silence Of The Lambs (-85 – Mark)
9. Rain Man (-78 – Jud & Mike)
10. Hugo (-62 – David)

What jumps out here is that the first 9 films on the list were Wildcard picks; only Scorsese’s “Hugo” kept it from being a clean sweep. I got 4 of the top 5 value picks, thanks to my strategy of saving my Wildcard picks until the end. Adam, on the other hand, despite also having significant data, blew his Wildcards early, which left him scrambling for decent films from the chosen directors. He realized his mistake midway through the draft and was kicking himself at the time.

Here are the top ten value picks for directors only.

Top 10 Value Picks (Directors Only)
1. Hugo (-62 – David)
2. Saving Private Ryan (-57 – Gordon)
3. The Wolf Of Wall Street (-51 – Jud & Mike)
4. Raiders Of The Lost Ark (-47 – Adam)
5. Kill Bill: Vol. 2 (-43 – Adam)
6. Charlie And The Chocolate Factory (-42 – Craig)
7. Gone Girl (-40 – Andy)
8. Lincoln (-36 – Jud & Mike)
9. The Curious Case Of Benjamin Button (-34 – Jud & Mike)
10. The Girl With The Dragon Tattoo (-33 – Gordon)

Adam fared better here, as he was able to grab good value thanks to having the stats, whereas other guys were drafting those directors’ films on reputation rather than raw data. Props to Jud & Mike though, who were essentially coming in blind, yet still grabbed three of the top 10. Given that they joined the group only a few days before the draft and didn’t fully understand the rules until they got there, I’d say they acquitted themselves well.

Bottom 10 Value Picks
150T. The Cotton Club (53 – Mark)
150T. Transformers: Revenge Of The Fallen (53 – Craig)
152. Reservoir Dogs (54 – Mark)
153. Wall Street (67 – Bill)
154. Kill Bill: Vol. 1 (68 – Craig)
155T. Clerks II (72 – Jud & Mike)
155T. Bram Stoker’s Dracula (72 – Andy)
157. Wall Street: Money Never Sleeps (73 – Craig)
158. How The Grinch Stole Christmas (79 – Andy)
159T. Clerks (81 – Craig)
159T. The Godfather: Part III (81 – Jud & Mike)

About all I can take from the bottom 10 is that Craig tended to reach a lot. For instance, Kevin Smith’s films, based on the pre-draft numbers, were all kind of bunched together and therefore didn’t represent much overall value in the game. Craig taking “Clerks” relatively early ended up costing him, because he didn’t wind up with a significantly better film (scoring-wise) than the guys who waited until almost the end to grab a Smith film.

Continuing on the value front, I compared where a film ranked among the 8 films in its director pool versus where it was drafted among them, to see which films might have been out of whack. Below are the good value/bad value results that were more than 2 spots apart. Not all directors had such an instance.

(First number in parentheses is where it ranked among the director’s 8 films, second number is where it was picked among them)
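For the fellow dorks, the within-pool comparison can be sketched like this (the film letters and pick orders are hypothetical placeholders, not real picks):

```python
def pool_flags(final_order, draft_order, threshold=2):
    """Compare a director's 8 films by final rank within the pool vs.
    the order they were drafted within the pool.  Films more than
    `threshold` spots apart get flagged GOOD (finished better than where
    they were taken) or BAD (taken earlier than they finished)."""
    flags = {}
    for film in final_order:
        rank = final_order.index(film) + 1    # 1 = best in the pool
        picked = draft_order.index(film) + 1  # 1 = first one drafted
        if rank - picked < -threshold:
            flags[film] = ("GOOD", rank, picked)
        elif rank - picked > threshold:
            flags[film] = ("BAD", rank, picked)
    return flags

# Hypothetical pool: film A finished 1st but was drafted 4th (GOOD),
# film D finished 4th but was drafted 1st (BAD).
final = ["A", "B", "C", "D", "E", "F", "G", "H"]
draft = ["D", "B", "C", "A", "E", "F", "G", "H"]
print(pool_flags(final, draft))
```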

David Fincher
GOOD: Gone Girl (2/5 – Andy)
BAD: Fight Club (6/1 – Bill)

Joel Coen
BAD: Raising Arizona (8/5 – Craig)

Kevin Smith
BAD: Tusk (8/5 – Bill)

Martin Scorsese
GOOD: The Departed (1/4 – Gordon)

Michael Bay
BAD: Transformers: Revenge Of The Fallen (6/2 – Craig)

Oliver Stone
GOOD: Salvador (5/8 – David)
BAD: Wall Street (6/3 – Bill)
BAD: Wall Street: Money Never Sleeps (8/5 – Craig)

(To be fair, Salvador didn’t have a Metacritic score, so it got the avg score of 72, whereas Wall Street had a 56, which is probably not an accurate comparison)

Quentin Tarantino
BAD: Kill Bill: Vol. 1 (6/2 – Craig)

Rob Reiner
BAD: A Few Good Men (5/1 – Jud & Mike)

Ron Howard
BAD: How The Grinch Stole Christmas (7/3 – Andy)

Spike Lee
GOOD: 25th Hour (4/8 – David)
BAD: Clockers (7/4 – Bill)

Steven Spielberg
GOOD: Raiders Of The Lost Ark (1/4 – Adam)
GOOD: Saving Private Ryan (2/5 – Gordon)
BAD: E.T. The Extra-Terrestrial (4/1 – David)

Tim Burton
GOOD: Charlie And The Chocolate Factory (4/8 – Craig)

Once again, Adam and I, with all our advance research, were generally able to grab top films from each director regardless of where we drafted, whereas Bill and Craig, who were going by feel, suffered.

Alright, that’s all the bonus stats I could be bothered to come up with. Obviously doing the actual statistical research ahead of time made a pretty big difference, although that knowledge wasn’t foolproof, as you still needed to use it with a proper strategy (sorry, Adam).

So here are the final overall scores…

1. Gordon (4608)
2. David (5525)
3. Andy (6031)
4. Jud & Mike (6100)
5. Adam (6116)
6. Mark (6333)
7. Bill (7183)
8. Craig (7218)

So I ended up winning by a pretty comfortable margin, which basically reinforces what I’ve been saying about having the stats ahead of time. I may not have always chosen the right director at the right time, but when I *did* draft a director I was always taking the remaining film from their catalogue that had the highest score for our needs.

Going into the draft I researched the scores for 391 different films: the 229 eligible for the 15 chosen directors (minus a couple of *really* small films), 29 for the 1970’s, 23 for the 1980’s, 40 for the 1990’s, 41 from the 2000’s, and 29 from the 2010’s. When the draft unfolded, 7 films that weren’t a part of my 391 were chosen.

Alien 3 (A Fincher film I somehow managed to overlook)
Kramer Vs Kramer (never popped up on my radar, as I didn’t dig too deep on Oscar noms)
Cinema Paradiso (I dismissed foreign films from my research)
Rain Man (not sure how I missed this one)
Oldboy (again, foreign film)
Avengers: Age Of Ultron (I didn’t bother with films from 2015)
Furious 7 (same as Avengers)

Fincher’s “Alien 3” ended up ranked 6th among his films, which could have cost me, but I grabbed his 5th ranked film (“The Girl With The Dragon Tattoo”). “Kramer Vs Kramer” was 8th among the 1970’s Wildcards, “Cinema Paradiso” was 8th among the 1980’s, “Oldboy” 8th among the 2000’s, and “Avengers: Age Of Ultron” and “Furious 7” ended up 7th and 8th among the 2010’s, basically justifying my decision not to research any of them. Only “Rain Man”, which wound up 5th in the 1980’s, could have hurt me, but I wound up with the #1 ranked film from that group, so I was okay.

So what the hell does all of this mean? In terms of the game going forward, it means that – surprise, surprise – doing all of the stat research ahead of time greatly improves your chance of winning. It also takes a lot of the fun out of it. But it’s still not without its flaws. For instance, in my pre-draft rankings, “Dragon Tattoo” was 4th and “Se7en” was 5th for Fincher, but they switched places in the final scoring. Same with Coppola, as “The Godfather: Part II” and “Apocalypse Now” switched the 2nd and 3rd spots on his list, while “The Godfather: Part III” and “Dracula” flip-flopped between 5th and 6th. “Hugo” actually jumped from 7th to 5th for Scorsese. Since my pre-draft rankings were based on all 391 films I had researched, some films were bound to get better or worse when we narrowed the field to the final 160. So the idea to base scoring on where films rated compared to their fellow draftees provided some degree of variation, which is encouraging. It basically means that even painstaking, accurate draft prep still won’t yield guaranteed results.

Before I go, I wanted to take a look at one other way of figuring the final results that I thought might make a difference. In looking at the standings above, you’ll notice that the final totals are all in the mid-to-upper thousands. Obviously this leaves a lot of room between teams, which isn’t much fun for competitive purposes. So I thought I’d re-score things more like they do in fantasy baseball. Rather than totaling up the scores for all the films each studio had, I ranked the studios within each stat category and summed up those category ranks. So rather than ranking 1-160 for films, we were only ranking 1-8 for studios. The results were pretty interesting…
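The rotisserie-style rescoring boils down to: rank the studios 1-8 within each stat category, sum the ranks, and the lowest total wins. A minimal sketch, assuming (as in the final standings above) that a lower raw score is better, and using made-up studio names and numbers:

```python
def rotisserie_totals(category_scores):
    """category_scores: {category: {studio: raw score}} where a lower
    raw score is better.  Returns {studio: sum of per-category ranks};
    the lowest total wins.  (Ties get arbitrary distinct ranks here --
    the real standings share a rank instead, e.g. the 3T entries.)"""
    totals = {}
    for scores in category_scores.values():
        # Best (lowest) raw score gets rank 1 within the category.
        for pos, studio in enumerate(sorted(scores, key=scores.get), start=1):
            totals[studio] = totals.get(studio, 0) + pos
    return totals

# Made-up two-studio, two-category example:
fake = {
    "cat1": {"StudioA": 10, "StudioB": 20},  # A ranks 1st, B ranks 2nd
    "cat2": {"StudioA": 5,  "StudioB": 30},  # A ranks 1st, B ranks 2nd
}
print(rotisserie_totals(fake))  # {'StudioA': 2, 'StudioB': 4}
```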

1. Gordon (10)
2. David (11)
3T. Adam (17)
3T. Andy (17)
5T. Jud & Mike (18)
5T. Mark (18)
7. Bill (24)
8. Craig (29)

In these new standings, Fish and I remain at #1 and #2 overall, but Adam moves up from 5th to a tie for 3rd, and Jud & Mike drop from 4th to a tie for 5th. That’s some pretty significant movement, and instead of me winning by a substantial margin, I narrowly squeak out a victory over Fish by 1 point. Needless to say, this is a far more interesting way to look at it, and it makes the draft itself much more crucial. Had we all been using the same data on draft day, there’s no telling how it would have turned out. My thinking is this is probably the way we should do it going forward.

So there you have it, boys. It was an interesting experiment, one I’d be willing to try again someday, given that it looks like there’s room for enough variation even when you already know the basic “scores” for each film going in. Changing up what “positions” we draft (be it different directors, actors, or films by year) would also seem to keep us guessing somewhat. Anyway, thanks for coming along with me on this crazy idea. It was fun (especially, you know, since I kicked all y’all’s asses).


P.S. – Since I already have them sorted, here’s how we ranked per studio in the 4 major stat categories, which is what I used to come up with the Alternate Final Standings.

1. Andy
2. Adam
3. Jud & Mike
4. David
5. Mark
6. Gordon
7. Craig
8. Bill

1. Gordon
2. David
3. Bill
4. Adam
5. Andy
6. Mark
7. Jud & Mike
8. Craig

1. Gordon
2. David
3. Mark
4. Adam
5. Bill
6. Andy
7. Jud & Mike
8. Craig

1. Jud & Mike
2. Gordon
3. David
4. Mark
5. Andy
6. Craig
7. Adam
8. Bill

