So now that the submission period is over, does anyone know how they are judging these maps? Are the pictures just spam protection? Do they actually watch the films? Do they give every map a fair playtest? Do they have sponsors for each category test the maps first? I was thinking they go about it as follows: check to make sure it has 3+ screenshots and one film; if so, they download the map and ignore the film/pictures (spam protection, poor-entry protection). After DL'ing the map, the 'sponsor' for that category would check over the map: bumps, z-flashing, general poor-quality architecture, spam. If the map is spam or clearly unplayable, it would not move on to the next phase. After making sure the entry is legitimate/playable, I believe they would test one to five games (this is the sponsor, not Bungie). They would write a review about gameplay (solid? vehicles? sniper OP?), enjoyability, and aesthetics, and include a label: Recommended, Not Recommended, or Highly Recommended. Bungie would play on Recommended maps after reading the review and Highly Recommended maps without reading the review. Not Recommended maps would be given a chance based on the review. And that's how I think Bungie's testing the maps. I hope they tell us for real, because speculation is dangerous and kills dreams.
i doubt it. my theory is that they have a team of people sifting through everything they've received. having the 3 pics + film is the first safeguard. if you have that, you are passed on to the next team; the next team watches films briefly, looking at the maps and seeing how they play. if something looks decent, it gets passed on to a team who actually plays the map. after that, it gets filtered through. at least, that's my theory. there's no way they are actually going to play them all. there are just too many.
I agree with Titmar, the filtering process is probably brief on a per-map basis, and 90% or more would be eliminated without any kind of test. The flood of obvious crap and mediocrity far outweighs the stuff with a reasonable chance of winning, and it should be fairly easy for an experienced eye to spot. (And that's not including the submissions that aren't tagged or grouped properly, don't have screenshots or a vid, and so forth.) It might be nice if they gave everything more of a fair shake, but you have to understand that the number of submissions is probably in the thousands or tens of thousands, and they have way, way too much stuff piled up there to bother testing anything except the slimmest percentage.
unfortunately, i think it's inevitable that some really great maps are going to get overlooked. i wonder if the winners for this Forgetacular will just be the first round of maps going into MM? Like, suppose at the end of their whole judging process, they have 200 maps that they have decided are worthy of MM. 200 maps is too many for one playlist. Too many even for 5 playlists. Maybe they'll introduce the first 20 or 30 in the first playlist update, and then a few months down the road they'll bring more maps in? That would be amazing. Whatever happens, whoever wins, what i hope for the most is that the playlists are amazingly kickass. I hope it's at least 10 maps per playlist.
Yeah I'd like to see that too. There has to be a lot of matchmaking-worthy maps out there, probably too many for active playlists to support - but surely more than the few meager entries they are planning to add in.
It would be very awesome if they added in maps throughout the year, so that matchmaking is always a fresh experience--no need for expensive DLC here! Also, while my theory may seem unrealistic, it assumes they have a large team for each category (sponsors). I feel like that way they could easily play every map.
Tbh, I think they should add a maximum of 3 maps per playlist every fortnight, so we actually have a chance to learn them and where the weapons are, etc. If 200 maps were just thrown into each playlist, you would never get the chance to learn them, and maps are boring if you don't understand weapon placements and tactics.
Given that Atom went through a month of playtesting, it's very possible that Bungie had winners in mind before the contest even ended. The fact that they mentioned there were other submissions they took interest in tells me that they saw some last-minute submissions they liked, and that they are going to try to squeeze a few extras in with the winners. I find it hard to believe that their opinion on a winning map would be a last-minute thing. But, for the sake of everyone who entered, it would be good PR to make it look that way. I also suspect that they use proxy gamertags and often join popular community test sessions. There are a few individuals who stand out to me, and through further research I have discovered that some of those individuals do in fact have ties to Bungie employees, whether they are on their friends list or have played with them recently. One thing I am almost certain of is that in tomorrow's WU we will hear something relevant from Bungie, whether it is announcing the winners or simply saying that they have picked their winners and are in the process of contacting the participants.
You raise some interesting concerns here. The idealist in me insists that such a thing can't be so and that the contest truly is legitimate... But the realist in me wouldn't be surprised if it's true. I'm just hopeful that the winners aren't anything similar to the garbage rascal cat puts out in the news feed. Regardless, tomorrow night should contain some worthwhile news.
I don't think there's anything shady about their practice. There are a lot of submissions, and the extreme number of last-minute submissions was probably underestimated. Someone recently posted the tally of submissions tagged in the Reach server:

Invasion: 287
Race: 59
Team Slayer: 504
BTB: 296
CTF: 257
Infection: 95
Arena: 360
Total map count: 1,858

That's quite a large chunk of maps. In theory, if they tested every map, and the average length of a match is, say, 7 minutes, give or take, that's over four hours of playtesting every map. This doesn't take into account how many maps were choppy, broken, or just below expectations. It also doesn't take into account how many participants didn't read all of the fine print in the contest rules and regulations, which exclude people outside the U.S. and allow only legal residents over the age of 17. I imagine a lot of the younger people skimmed over that part. Similarly, how many of the maps are remakes, either from other Halo games or elsewhere? Amid all this chaos:

*They still have to pick at least 7 winners for each category
*Playtest these maps thoroughly
*Build a community playlist for these maps
*Contact the winners
*Write an update
*And finally code the maps into Matchmaking in the community playlist (possibly Maption Sack?)

I imagine they will also squeeze the maps into the playlists for which they were intended.
I hope they are not aesthetic nazis like they are here... I mean, here, laggy (but pretty) maps are the ones that always get the attention...
To be honest, I don't think there will be THAT many maps that clear the MM-worthy bar. I think 5-7 maps per category will actually make the cut, and then it comes down to personal taste, layout, and a map's uniqueness in comparison to other maps already in MM rotation. I could be wrong, but I've looked at a lot of maps, and while some may be visually appealing, the spawns and small nuances don't add up to something you'd see in MM. Unless Bungie will be notifying people of small adjustments to make to certain maps, I don't expect to see a LARGE number of maps added into the mix. Gameplay and frame rate will definitely come first for Bungie. However, a map that nails those two and is visually appealing will beat out something that's a bit more bland. It's actually somewhere around 217 hours to test every map, assuming the 7-minute testing time and map totals are accurate. That's IF they test every map, which they obviously won't. Even still, that does put a good amount of perspective on why the contest results have been delayed.
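For anyone who wants to check the 217-hour figure, here's a quick back-of-the-envelope sketch using the category tallies quoted earlier in the thread and the assumed 7-minute average match length (both of which are just the numbers posted above, not official figures):

```python
# Per-category submission tallies quoted earlier in the thread.
submissions = {
    "Invasion": 287,
    "Race": 59,
    "Team Slayer": 504,
    "BTB": 296,
    "CTF": 257,
    "Infection": 95,
    "Arena": 360,
}

total_maps = sum(submissions.values())        # 1858 maps
minutes_per_match = 7                         # assumed average match length
total_hours = total_maps * minutes_per_match / 60

print(total_maps)          # 1858
print(round(total_hours))  # 217
```

So one 7-minute game on every single submission works out to roughly 217 hours, and that's before counting repeat tests, broken maps, or disqualified entries.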