The Problem with Year-end Top Albums Lists

It isn't meaningful to rank 50 buzz records in an arbitrary order
As 2011 creeps to a close, we'll see increasingly frantic holiday shoppers struggling to fill their gift lists. We'll see dropping temperatures, shortening daylight hours, and giddier kids. And in the music world, we'll see every kind of writer, from high-profile critics at magazines to self-made, near-anonymous bloggers, penning their lists of the top albums of the year.
 
It's started already. Readers get excited like kids waiting for Santa. They want to know which of their favorite records made the list, want to feel justified where their tastes line up with the top journalists and outraged where they diverge. Music writers know this. They know how people read these lists, and I have the sneaking suspicion they construct them with subtly manipulative tactics in mind. It's their job to get people reading and talking, after all, not necessarily to evaluate the objectively best music released throughout the year. So some inclusions will be predictable, some will be out of left field. Some good records will be left off just so people can get upset about their omission. Largely, it's a fiasco. Largely, it's more akin to a VH1 countdown than actual music journalism.
Pitchfork probably had the first big year-end list, if only because Pitchfork essentially created the music blogging scene as we know it. And plenty of people follow suit. You're likely to see tens upon tens of lists across the internet, all with the same records in slightly different orders. Maybe you still take it seriously. I did, once. Now? Now I'm skeptical that there's any real difference between Pitchfork's 17th best record of the year and Pitchfork's 18th best record of the year.
 
I mean, come on. A list of 50 records might as well be a list of 500 records. Are you really going to claim that only 50 artists came through with work good enough to merit a gold star? These numbers, these rankings, they don't mean anything anymore, especially given the scope of genre covered by the great majority of hot blogs. If you really want to say something meaningful, divvy up your list by type. Give me your favorite glo-fi record, the best of 2011's chillwave, the top hip-hop, the most innovative rock 'n' roll. Don't rank them in a neat little row. Just acknowledge what they've contributed and move on. Lists of personal favorites? Totally cool. You acknowledge your subjectivity, you mention why each record affected you, you write a good article. But seemingly objective "best-of" lists are where I get annoyed. If you really want to investigate music critically, you don't make preferential rankings without comment. You dive into the why.
 
Oh, and stop acting like Girls is a good band. I promise you: in four years, everyone will be totally embarrassed they ever listened to Girls. Promise.