Fun Fact: Googling for "recursion" will give you a message asking "Did you mean recursion?"
I had a fairly surreal experience today. I was watching a trailer advertising an upcoming video game. While watching it, an advertisement popped up on the bottom edge of the video (one of those banner ads you click the X to close).
Ladies and gentlemen, advertising has now gotten to the point where they put advertisements in your advertisements so you can see advertisements while watching advertisements.
20100430
20100429
Messaging
To put it simply, I like being talked to like I'm an intelligent human being, generally because I regard myself as such. Thus, when Republicans, Democrats, and blue furry creatures from Alpha Centauri go on the news spouting emotion-grabbing nonsense, I am insulted and tune out.
Thus, I like Republicans like Ken Blackwell who are well-spoken, willing to engage facts, and able to have an actual discussion/debate on the issues. I may not agree with them, but I feel much better about the prospect of such people having a role in government when they aren't obsessed with blatant fiction (or completely unable to dispel one; what the heck, Democrats?).
20100426
Bronze
I am good at video games, a fact known by anyone familiar with me. This fact is responsible for two traits of my character I am very knowledgeable about, and one I had little or no idea existed until very recently.
The first two are straightforward, though I won't go into deep explanation. To summarize, one trait is my stubborn, never-say-die habit of challenging players better than me, and the second is my inability to focus on any one pursuit/choice within a given game. I have known for a while the madness which spawned these characteristics, and they have long since ceased to surprise me.
Recently, however, my joining the StarCraft II Beta confronted me with an aspect of my character I had never seen before.
Despite my generally quick acclimation to new games, good reflexes, past experience, and stubbornness, I found myself fearful of actually testing my abilities against other people. I could not understand what was wrong with me. Why would I have any reason to be afraid to find out where I stood? If I was better than everyone, no big deal. If I was a terrible player, even better. I had no reason I knew of to be in a catatonic state of terror.
Eventually, after forcing myself through the obstacle, I understood. What I feared was the unknown, a genre I hadn't touched seriously in a decade. The fact that my brother proved head and shoulders better than me in both the original and StarCraft II didn't bother me directly, but it opened the door to doubts. I doubted whether I would be any good, whether I would learn and improve, and most critically whether I would live up to my name and history.
I was unknowingly wrapped up in the mythos of Me, the Undeniably Awesome Gamer. StarCraft II represented a grave threat to my understanding of myself as a good gamer. I have failed utterly at other games and genres, but I was never particularly serious about them, so they didn't matter (I'm on to you and your tricky oceans, Ace Combat). StarCraft II, however, whispered to me sweet, sickening invitations to prove myself a sham.
Afraid I would turn out to be normal, I shied away, forgetting that everyone starts a new or forgotten genre a complete nub and goes from there. As it turns out, everyone includes me.
Having gotten over myself, we'll see if I can't turn up the learning machine and become a kick-butt player. In the meantime, I'll sit in the Bronze loser's league and nub it up.
20100422
Artificial Incompetence
The StarCraft II beta is very much designed to be played against people. There is an Artificial Intelligence you can choose to play against. However, Blizzard has perfectly simulated how a Ritalin-driven child would play while sedated and restricted to the use of only one finger.
They label this difficulty "Very Easy," and there are currently no others. I can think of very few friends of mine who, without ever having played StarCraft before, wouldn't be able to defeat one of these on their first try while blindfolded and forced to recite the Gettysburg Address backwards. That's fine; it's labeled exactly what it is. But if you're an anti-social hermit, the beta will not be for you.
20100421
Purchasing Power
Thanks to GameStop's current promotion, by which one may obtain a StarCraft II beta key via a preorder, I am now in possession of the StarCraft II beta.
Wheeeeeeeeee...............
20100420
20100417
It Came from the Blog
So recently a fairly unimportant topic regarding WoW came up.
Lots of people talked about it, but you really don't care about any of them.
I read a lot of blog posts on the subject, partly because I found them interesting and partly because I didn't have time to sit down and read a book.
While I was reading these blogs, I began to notice something odd. It was present in most of the blogs and obvious almost from the moment I began reading, though I shrugged it off the first few times.
I noticed this: almost every sentence was separated out by two carriage returns.
Really, it looked exactly like this. Nearly a dozen blog posts where the sin of joining two sentences together was only brooked when one was sufficiently short.
I'm a person who's fairly big on writing and the theory behind doing so; these kinds of minutiae tickle my cerebrum in happy ways.
I can only describe this style of textual organization with one word: disjointed.
Reproducing the effect is difficult for me because I simply do not write in that style, but down to the very core of the text each sentence felt like a separate, lonely thought, loosely connected to what came before and what awaited after, drifting in a sea of confusion.
Some blog entries had the good grace to figure out halfway through that people were now in it for the long haul and one might be allowed to write complex sentences or even paragraphs -- sweet relief!
Not many did. This caused me great sadness.
Usually the last sentence was very short, as though the author petered out.
-----------------------------------
I can no longer maintain that facade, as doing so is causing my sanity preservation systems to suffer intolerable stress. Do the people of the internet truly think in minuscule, disconnected chunks? The nature of the byte is such, but surely not humanity. To continue would be to lessen myself as a sentient being, or at least deny my nature until I am forever broken.
Perhaps it is modern education, or modern media, that has effected this madness.
We live in a culture of sound bites and flashy, brief declarative statements made without useful context or connection. The use of constructs like paragraphs or even letters fades in the face of ever-briefer bursts of communication. In concordance with our shorter attention spans, we are inclined to process data in smaller chunks.
Woe to us, however, if this permeates our psyches to the point where even our very thoughts become microscopic. There is beauty in the connections between ideas, events and memories, in the smooth flow of a whispering, fluid stream of mentality. Blending each instant of consciousness into the next is our assurance -- alas, the dreamless night that robs us of our security -- of our connection to our past and future selves. When thought itself becomes a series of brief flashes of notion, separated by clean breaks without context or binding ties, we lose the breadth and width of creation for a tiny, shallow world where nothing exists outside of the moment.
Perhaps I go too far in my waxing of philosophy, but even that askewed or even abusive misuse of the idiom warms me with ties to memories of my father chiding me for my malapropisms, and many other connections which, in being stirred together in one motion, create something both fearful and joyous. If I go too far, I at least have the confidence that my error only leads to dreams and abstract notions that encompass more and more of creation.
Still, I wonder if I am a time-lost relic, for as often as I am similar to my peers I am again so dissimilar as to wonder if I wasn't left on the doorstep of my generation by fourth dimension-traversing gypsies.
20100327
Saturday Morning Routine
11:30 AM: Wake up.
11:33 AM: Realize that there are only 27 more minutes in which to have a morning routine.
20100309
Suspense
So, Portal 2 is on its way this Christmas. At the same time, a sequel to Tron is on its way this Christmas. Oddly enough, the former prospect excites me while the latter prospect frightens me.
I think this is largely because the worst case scenario for Portal 2 is that it tries to be Portal. In such a scenario, you'll have the charm of the mind-bending puzzles, though perhaps you won't experience a plot that's quite as original, creepy, and involving. I can live with Portal 2's worst case scenario, and the potential jubilation the best case scenario represents is off the scale.
Tron Legacy is very different. The worst case scenario is an absolute disaster. The movie is clearly trying to be two things at once: a callback to arcade nostalgia and a flashy special effects film. Those are two very different beasts, and yoking them together will take a great feat of directorial and editorial strength. I suppose the best case scenario is that Tron 2 becomes the StarCraft of 3D special effects films, perfecting the formula and setting standards for decades to come. I'd definitely enjoy that, but I don't harbor a shred of hope that's what will happen.
20100215
Buckle Up!
When I returned to my desk after standing up to grab a snack, I attempted to buckle my seat belt. It took me slightly longer than half a second to realize there was no seat belt on my desk chair.
20100210
Signs from God
People often imagine that signs from God are giant, epic visions of Charlton Heston on the mountain. Usually, they are more mundane.
A good example: seeing a dime in your freshly washed laundry and thinking, "Heh, I may not have listened to my mother and checked all my pockets, but it's not like I've ever accidentally washed anything important," and shortly thereafter seeing that you just washed the key cards that grant you access to your gated community and underground garage. Better yet, they still somehow work.
Why do I bring this up? No reason, no reason at all.
20100204
Gaming Theory
Pet Project Idea: Social Study through gaming.
The concept is to create a fairly decent game along standard archetypes; the higher the quality of the game, the better. The game should be designed, coded, tested, refined, and completed as would be standard for any game of its kind.
The twist is this: there will be a clearly labeled "WIN THE GAME" button on the screen at all times during play. Before the game starts, there will be a clear statement of intentions. Clicking the "WIN THE GAME" button will win the game instantly. It will also record the time spent playing before the button was pressed. This information will be gathered on a website for public display.
I think this would be a very interesting social experiment, particularly in gathering information about the habits and mentality of gamers.
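The bookkeeping involved is trivial, for the record. Here's a minimal sketch in Python of the timing mechanic; everything in it (the names, the printout standing in for an upload to the public website) is a hypothetical stand-in, not a finished design:

import time

class WinTheGameButton:
    """Tracks how long a player resists the big shiny button."""

    def __init__(self):
        # The clock starts the moment play begins.
        self.start = time.monotonic()

    def press(self):
        # Instantly win, and record how long the player held out.
        elapsed = time.monotonic() - self.start
        record = {"seconds_before_press": round(elapsed, 1)}
        # In the real experiment this record would be uploaded to the
        # public website; printing stands in for that here.
        print("YOU WIN! Time played before pressing:", record)
        return record

# Hypothetical usage: the player plays for a while, then gives in.
button = WinTheGameButton()
time.sleep(2)  # stand-in for actual gameplay
button.press()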
20091215
Come 2030.
Gaming itself is changing, and the ways in which this nascent industry is evolving are many. The future of gaming is multifaceted, uncertain, and even frightening.
First, let us take stock of the three current spheres of gaming: PC, Console, and Arcade.
Arcades are where gaming started, but they are now largely relegated to gimmicks and unique control interfaces to continue functioning. The fall of the Arcade can be traced to the onset of home gaming. PCs were the harbinger, but it was console gaming that stole the masses away from the coin-glutted cabinets of the Arcade. While Arcades maintained graphical dominance even into the fourth generation of consoles, the convenience, cheaper cost, and portability of consoles, and eventually PCs as well, overwhelmed the graphical edge of arcades. Shipping a new machine to arcades was more difficult than releasing a new video game to retailers. Today all Arcades can offer is a unique experience.
PCs today are relegated to a few niche genres and casual, browser-based games. PC gaming was to the Arcades as television is to movies: similar, but overall an experience of lesser quality. This situation was turned on its head by the third generation of consoles, with gamers spending in excess to build PCs capable of graphics equivalent to or beyond what was affordable in arcades. This subculture of gaming-rig builders continues to this day, and still acts as the driving force behind hardware advances. Still, most PCs aren't built for such intense processing, and many people aren't interested in giving their children a reason to compete for time on the PC.
Consoles currently dominate the gaming industry. Consoles are the youngest sphere of gaming, created out of a desire for Arcade-quality gaming at home. This feat wasn't possible on PCs back when the first Atari was released. PCs eventually caught up, but Consoles were far cheaper and still matched Arcades. These factors continue to push the dominance of Consoles, even as the lines between them and PCs blur. Getting cheap, convenient quality is a hard bargain to pass up.
This is where we stand today. The detail is here because it gives some sense of where the industry has come from and how it has traveled thus far, which is important in considering where gaming is going.
First, Console gaming is dying. This seems a drastic statement, but the situation is as I have stated: in two decades, consoles as we know them will cease to be relevant. Consoles have already diverged far from the machines we saw in the 1980s. They are almost identical to PCs, save for their unique form factors and input devices. In a surprisingly short amount of time, Consoles will be little more than prefabricated gaming PCs, much like Alienware makes now. Proprietary installations will likely continue for some time, but it's only a matter of time before the homogenization of features and hardware begins to defeat the purpose of separate machines.
Second, there will obviously be a resurgence of PC gaming, though PCs as well will have changed. As consoles become more PC-like and the industry transitions back, demands will come for consistent, stable hardware on which PC gaming can be supported. All current Console makers will surreptitiously transform their consoles into gaming PCs, attempting to become the standard. Eventually one, or more likely Alienware, will emerge as the basic standard, with the others being only tangentially supported.
Third, a new sphere of gaming is fast emerging. While Game Boys and the like have been around for years, only those with more than a passing interest would purchase them. Now, however, everyone has a phone or portable device capable of playing games, whether they initially intended to use it for that purpose or not. This sphere will dominate the casual market, and even introduce new genres possible only on such a widely mobile and ubiquitous platform. Kids will catch Pokemon not by moving a virtual character around a virtual environment, but by walking to school, to the cafeteria, running around the yard, and more.
Fourth, the future of Arcades will depend on future technology. In order to survive, Arcades need a new technology that is too expensive for home use. They might attempt to cash in on mobile gaming, acting as hot spots for events and special rewards. Failing a new technology, Arcades in the US will become an antiquity, something novel but no longer critical to the industry. They will continue to have relevance in Japan, but that too will diminish without a significant, unique hook.
Most importantly, by the time all this has come to pass, a point will have been reached where additional graphical power is largely irrelevant. A difference will remain, but it will be largely unnecessary for story-experiencing purposes. Gaming will experience its own "impressionist" movement, moving away from photo-realism and graphical superiority to more creative and interesting uses of computational power. Corporations, however, will initially act to squelch or ignore such titles, as they will continue to trust in the staple genres and styles.
See you in 20 years.
20091113
Manga, ka?
Reading humorous manga on four hours of sleep is an exercise in fits of laughter.
I think I'm going to read all hilarious manga while sleep-deprived from now on.
20091112
20091111
Zoom zoom.
My mother often complains about movies being too much like video games. It was for this reason that she didn't like the recent Star Trek film. At the time, I had a factual understanding of her complaint, though I lacked a visceral understanding. Academically speaking, her issue was that directors enjoy swooshing a camera in and around the action, but for people who don't habitually subject themselves to this kind of visual overload it's too much to keep track of.
Last night I totally went bonkers for the same reason as my mother.
After dining with my grandparents, we turned on the television to give ample time for digestion before dessert. Choosing to edify ourselves through PBS, we watched as NOVA discussed human ancestry and anthropology, likely due to some recent discoveries in the field.
I can't really be sure, because I was horribly distracted by the director's incredibly annoying camera work.
Our new generations have grown up in an era where information is instantly available, where attention spans are ever shorter, and where video games now involve flailing in front of the television. I can understand that NOVA, as it was when I was a kid, has to expend some effort updating its methods of operation to match the changing times. To remain the same is to become a fossil.
That said, they should find better directors.
There are a lot of tools at a director's disposal. The more obvious the tool, the less often it should be used. Otherwise the viewer becomes aware of the tool and it ceases to be illuminating. Instead, it begins to obscure in proportion to how much it is abused.
The director of this episode of NOVA had an obsession with two forms of zoom. Form one was to start close to a picture, just close enough that most of the important bits weren't visible, and then quickly zoom out to a fuller view with a graphical blur and refocus, accompanied by an audible whoosh. Form two was to start zoomed out from a picture, just far enough that most of the interesting bits were too small to make out, and then zoom in with the same graphical and audible effects as the other form.
The intent was obviously to make pictures of skeletons and anthropologists exciting. However, the frequency with which these zooms occurred, and the short duration the technique allotted for actually looking at the skeletons or interesting photos, created a situation where it was nearly impossible to appreciate whatever it was the director wanted you to look at.
I could almost have sworn there were two little kids fighting over the zoom function on the camera, all while talking in whoosh noises.
In addition to this visual repetition, the program itself was arranged with many, many repeated narrations. I can scarcely remember just how many times the narrator said, "For the first time in X years...", "Then, there was an amazing discovery...", and similar phrases. I could potentially see the worth in continually repeating the weird names of the skeletons, given that they aren't easy to remember or learn. But for the love of variety, don't say the name with exactly the same inflection, tone, pitch, and feeling every single time.
The whole presentation felt like a broken record being played over a projector, with bits and pieces of a child's wild drawings thrown in. By the fifth amazing discovery I couldn't bear to watch it anymore. Not that I could have seen anything anyway, what with all the blur-zooming going on. It might have been better if that conspicuous whooshing noise weren't there every single time.
Grr. Get off my lawn!
20090915
20090831