Bwahahahaha!
I don't know why, but for some reason that seemed appropriate.
Anyway...
I think I'll keep spacing out this post in this unique, per-sentence fashion.
As it is, the reason for this particular entry into my journal of me is the listing of three wonderful items for sale on eBay!
That's right, YOU could be the proud owner of something I've already graced with my merest touch.
I'm auctioning off three anime DVD sets that I didn't really care for and didn't relate well to.
Two are of the same series, and are probably as child friendly as they come.
The other is most definitely adult.
In any case, all the formatting I had to do to make my auctions pretty made me sick of paragraphs and html for a while.
Hence the weirdness.
In conclusion, any interested parties should follow the links below (which, once the auctions conclude, will be screenshotted and hosted somewhere so that years hence there isn't a plethora of dead eBay links lying around here).
Heaven forbid slashdot go under, my blog linkage would be ruined!
Magic Knight Rayearth Season 1
Magic Knight Rayearth Season 2
Neon Genesis Evangelion
I realize some of you may be disappointed to see I did not include Trigun or Cowboy Bebop among the auctions.
My only response is, why would I want to sell anything so wildly awesome as Trigun or Cowboy Bebop?
Have fun!
20050729
20050727
Waiting
Exactly why I have patience, I'm not sure. This isn't a why in the sense that patience shouldn't be had, but a why in the sense of why I have been blessed with a sizable sum of that particular attribute. Even that isn't entirely fair, because my patience can be selective, and is shorter in some areas than others.
In any case, to some degree I suspect my patience comes from my longstanding relationship with Blizzard, the makers of WarCraft, StarCraft and Diablo. Anyone who has ever waited for an upcoming Blizzard game cannot help but know that being a fan of Blizzard requires being a fan of waiting.
Example 1: StarCraft. The Mac version was released well after the PC version, causing me to have to wait an additional six months before I could get the game.
Example 2: Diablo II: Lord of Destruction. A particular feature, Rune Words, was only partially implemented at launch, and it took years before it was finally finished fully.
Example 3: WarCraft III. This game was delayed by about a year or more (at one point the Blizzard website claimed they were aiming for a release date a month after it had already passed).
Example 4: StarCraft: Ghost. This particular example will better reveal to you how the process actually works. A lot of the details for the previous examples are really foggy for me, but the waiting went much like the following. StarCraft: Ghost was announced through waiting. Literally, the entrance to Blizzard's website, for twenty-four hours, was an image that slowly grew clearer as time went on, along with a countdown to midnight PST. At that point, the release date was slated to be the Christmas of my freshman year of college. Come November, that date was pushed back to February. Come February it was pushed back to April. That March, I preordered the game (and still have the preorder). However, come April the game was again delayed, this time until June. Come May, the game was delayed until November. As the summer ended, Blizzard's partner on the game left, and the game was delayed indefinitely. Now, it will be this Fall, while I am in Japan (fat lot of good that preorder will do me now), that the game will finally be released, a full three years after the initially projected date.
Understand that the examples prior to 4 worked in very similar ways. This is how Blizzard is training a generation of gamers to be patient.
In any case, yesterday I learned that every Tuesday there is maintenance on the servers for World of WarCraft. I also learned that, much like release dates, the time when the maintenance will be finished is usually much later than projected. In a way reminiscent of the waits for the Mac versions of their software (which ended after Blizzard released WarCraft III as a Mac/PC hybrid CD), all of the servers I play on had special, extended maintenance.
So, I had to wait, a lot.
The good thing is, my past experience with Blizzard trained me to do something constructive in the meantime. I wasn't part of the millions of frothing gamers refreshmonkeying Blizzard's page showing the current status of the servers. Out of curiosity, I tried to connect to the site a few times, but it might as well have been under a denial of service attack, there were so many people holding their breaths (and occasionally lapsing into unconsciousness for holding them too long).
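For the curious, the constructive alternative to refreshmonkeying can be sketched in a few lines. This is my own toy, not any tool Blizzard provides, and the `check()` function is a hypothetical stand-in for "ask whether the servers are up": poll, but back off exponentially, so you aren't part of the accidental denial of service.

```python
import time

# Toy sketch (mine, not a Blizzard tool): poll a status check with
# exponential backoff instead of hammering refresh.
def wait_for_servers(check, max_delay=300):
    """Call check() until it returns True, doubling the wait each time."""
    delay = 5  # start by waiting five seconds
    while not check():
        time.sleep(min(delay, max_delay))  # never wait more than max_delay
        delay *= 2
    return True
```

The doubling means the first few checks are quick, but an hours-long outage only costs you a handful of requests.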
It really isn't surprising then that as of this very sentence being typed, my servers are still down.
So, I must continue to be patient. And so I leave you with this brief quip (added to my random quotes).
Me: I need to exercise patience.
Friend: Indeed.
Me: Darn you patience! You need to lose wait.
20050725
Ill Omens
The Good News: There probably isn't anything wrong with my windows installation, probably.
The Bad News: There probably is a lot wrong with my RAID array, probably.
Basically, these blue screens of death I've been getting appear every time one of my hard drives vanishes. When I say vanishes, I mean it can no longer be detected by my motherboard. I do not know why this is yet, but I do know that it is A) a pain and B) a very, very big problem. Until I can crash my system enough to understand which hard drives are being problematic and which aren't, or even whether it is a matter of any particular hard drive at all, I'm not going to bother reinstalling World of WarCraft (whose blue screen of death tendency may really be because the greatest amount of Raijin's running time is spent on it), or installing development tools.
In any case, I've got an hour now to wait for my RAID array to be rebuilt, again. I have several suspicions as to why this is happening.
1) I'm not clearing out the drives that are being problematic when I rebuild, resulting in conflicting data and thus crashes.
2) One or two of my hard drives are on the fritz, and will require replacement.
3) Things have just been generally screwed up with my installation of, everything, and will need to be redone from scratch, at which point I might as well retry things with that RAID card that screwed everything up in the first place.
Obviously, these are listed in order from what I hope is the case, to what I fear is the case.
Get well soon, Raijin. :(
20050724
r313371ng
Forgiving the title, here's an update.
My new computer, Raijin, is getting better. That's not to say I've been without problems, but I'm slowly killing them. There have been a couple of RAID rebuilds (the result of some random instances of hard drives going undetected, and a random split of the array), and some blue screens of death (my Dad tells me there are no blue screens of death in XP; I have tangible evidence to the contrary). In any case, the coming days will entail the following.
-An evaluation of whether I should reinstall XP (doesn't seem likely, but maybe)
-Reinstalling World of WarCraft (current chief blue screener)
-Installing Dev C++ (for programming purposes)
-Not reinstalling that demonic RAID card that screwed everything up
-Finding a different way to install a CD-ROM
However, at the moment I'm pretty content with how things are. Everything that is left is pretty inconsequential. So unless I continue to get odd stuff happening with my RAID array (the random array split, and the missing hard drive incident post-rebuild, have me worried) everything should be fine.
In other news, wish my Dad a happy birthday!
20050720
Anger Management
Right now, I am a coiled ball of malice and hatred ready to explode into an infinite dimension of pain and suffering for whoever created the expansion card I plugged into my 1337 computer at approximately three PM today. Here's why.
My 1337 computer, in all its glory, had its beautiful RAID array utterly dismantled by that evil bastard of a piece of electronics. Upon my plugging the little devil of a PATA-slot-providing card into my PCI slot, it proceeded to attempt to supersede my motherboard by claiming to be the PATA slots the motherboard itself had.
This resulted in incredible confusion, during which my motherboard completely disassembled my RAID array and became generally screwed up. Actually, screwed up is barely an adequate description of what happened later...
In any case, I quickly found that by unplugging the malevolent card, my computer could once again boot properly. Seeing this, I tried to install drivers for my last and surely not least (in terms of pure suffering it has laid upon me) component. Upon doing so and hooking up the beast again, the problem only got worse.
I unplugged the beast again, only to find that instead of one healthy RAID array, my hard disks were now split into two RAID 0+1 arrays. Aside from the obvious (to geeks, anyway) fact that it is impossible to have a RAID 0+1 array with only two hard drives, there was an immediate "What the...?" followed by a rather large increase in stress, as well as the desire to do to my destroyer what the guys from Office Space did to that printer.
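For the non-geeks, here's a toy sketch (my own illustration, nothing out of a real BIOS) of why two drives can never make a 0+1 array: RAID 0+1 mirrors a set of striped drives, so you need at least two stripe sets of at least two drives each.

```python
# Toy illustration (mine, not real RAID firmware): minimum drive counts
# for a few common RAID levels.
def min_drives(level: str) -> int:
    return {
        "0": 2,    # striping: at least two drives to stripe across
        "1": 2,    # mirroring: at least two drives to mirror
        "0+1": 4,  # a mirror of stripe sets: two stripes of two drives each
    }[level]

# With only two drives you can have RAID 0 *or* RAID 1, never 0+1.
assert min_drives("0+1") > 2
```

So whatever my BIOS thought it was reporting, it wasn't describing any array that could actually exist.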
After two hours of toil, I was finally able to piece my RAID array back together, and boot up properly. However, just now as my RAID array was rebuilding, I tried to run an application, and the whole computer crashed hard. Now, upon booting, the BIOS sits around forever trying to detect the RAID array (and never stops). Hell, what does it matter anymore seeing as how the registry entries for every program on my computer were screwed up by this as it was.
In any case, Me is not a happy Me at the moment.
On a happier note, because you've had to deal with my current fury, God bless you.
20050718
1337 5k33lz
Booyaka.
Right now I'm writing this using my 1337 new computer, Raijin (God of Lightning). It is truly 1337, and I cannot stress enough how awesome it truly is. Hopefully some of you will get a firsthand look in the coming month before I go to Japan.
In any case, the only missing link is the expansion card that will allow me to install my CD/DVD-ROM drive, thus giving me access to my vast collection of Blizzard games and other ones as well. Until it arrives (either this afternoon or tomorrow) I'll have to content myself with Counter Strike.
I toiled long and hard on this, and I came out with something awesome. A lot of kudos go to God, for keeping me from destroying my computer parts accidentally, and granting me keen insight into the incredible lack of information in the manual for my motherboard, and the complete lack of a manual for the computer case. Surely it was he that allowed me to so easily find that forum that explained away my confusion concerning setting up RAID with my motherboard.
My Dad now claims he and everyone else will now come to me with any hardware related computer trouble. I don't know if that's wise...
Anyway, time to save some hostages.
20050716
Daisy....da....i...s..y....
Construction on my soon-to-have-a-really-cool-Japanese-name 1337 computer has begun. The motherboard, CPU, graphics card and memory are all in, and most of the fans. The only complication that remains is in setting up RAID for my hard drives (the manual doesn't go into it) and hooking up the DVD/CD-ROM. There's a slight chance I'll have to borrow a floppy drive to set up RAID, but there are a few computers I can do that with.
I may eventually overclock my weapon, but as I don't understand optimizing overclocking and preventing explosions yet, that's a far off project.
In other news, I'm done bothering to read any more comments on Rockstar's "Hot Coffee" debacle. I won't even bother posting a link to pertinent information. There's not much to know.
Basically, Rockstar coded a mini-game into the latest Grand Theft Auto game that had the game's main character having sex. Whether from some sudden decency (unlikely), some sense of incompleteness, a sense of how lame the idea was, or fear of retribution, Rockstar blocked the content off.
The problem arises from the nature of how they blocked it off. It was a coded detour around the mini-game, as opposed to an actual removal of the pertinent code. When some skilled, and likely bored, hackers poked around in the game's code, they found the mini-game and released a mod to unlock it.
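To illustrate the distinction (a hypothetical sketch of mine, not Rockstar's actual code): a detour leaves the content shipping on the disc behind a single check, which is exactly the kind of thing a mod can flip back.

```python
# Hypothetical illustration (not Rockstar's actual code): the minigame code
# still ships in the binary, gated behind one flag -- the "detour".
def play_minigame(hot_coffee_enabled=False):
    if not hot_coffee_enabled:
        return "skipped"        # what shipping players see
    return "minigame runs"      # content still present, waiting for a mod

# What the mod effectively does: flip the flag back on.
assert play_minigame() == "skipped"
assert play_minigame(hot_coffee_enabled=True) == "minigame runs"
```

Actually removing the content would mean deleting the gated branch entirely, leaving nothing in the binary for a mod to unlock.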
Now Rockstar is under fire for not alerting the ESRB to this minigame. Such famous people as Hillary Clinton and the ever-persistent Jack Thompson are talking (and raving) about the issue. As a be-all and end-all, I'll make my points and never speak about it again, unless some crazy legislation passes that will cut back my First Amendment rights as a video game developer.
Point 1: Rockstar is, partly, responsible in that they did not sufficiently remove the code. They obviously took the time to make a detour for it, it shouldn't have been too much harder to remove it (although not easy as pie).
Point 2: It seems utterly ludicrous to me that a game company should include in their presentation to the ESRB anything that will not be playable in the final game (as the company finalizes it). Any additions or gameplay changes by others should be separate from this.
Point 3: Jack Thompson is still out of touch with reality.
Point 4: My greatest fear in all this is the legislative possibilities that might arise. The complete hassle of developers showing the ESRB every last nook and cranny of their game, including features that aren't complete, won't be completed, or will be removed, will only hurt the industry and further hinder the efforts of small, independent developers. The creation of a new ratings system might not be so bad, in that when it kicks in and the problem isn't solved, because the ESRB worked and the problem lies largely with parents who buy the games for their kids, people might realize that they need to take responsibility and be parents. Still, a new system is simply more likely to make life a living hell for anyone who wants to sell or buy video games.
I'm picturing a world where I'm on a list of "VG offenders" that ignorant people can look up and decide I'm a bad person because I've played FPSs. A world where buying video games that aren't all sugar-drop fairies and gumdrops takes a background check more thorough than one for firearms. Heck, a world where developing games requires several hundred licenses which need to be continually renewed at high cost. And while we're at it, the Japanese will get nuked again for being the source of the problem.
The sad thing is, only the last sentence was the least bit improbable.
20050713
Escapism
If you've got some interest in video games, or where that industry is going, The Escapist is an interesting read. It's a magazine that just started recently, and is a wonderful internet journal.
That's not to say I necessarily agree with everything said within the articles, but what is said is said very well and obviously had far more thought put into it than the rest of the internet (although that isn't necessarily saying much).
In any case, the one article I can say I agreed with the least had to be The Contrarian. The fundamental argument of the article is that standardization eventually kills mutants, and that by that principle Nintendo is doomed (hardware-wise). This summary hardly does the article justice, so I suggest you read it yourself before reading any further.
The author uses some pertinent experiences to illustrate the major point, and his argument remains consistent and connected. However, I don't think that the argument holds. One of his early points is that "ubiquity only works when married with standardization". This is the first hole in his argument.
Nintendo's hardware has largely defined what is standard, especially in terms of the controller. The analog stick, the rumble pack, and the cardinally placed button layout seen on most controllers were all Nintendo's before anyone else's. In fact, Nintendo has been so influential that aspects of their controllers have even been hastily mimicked in a desperate attempt to beat Nintendo to their own innovation. In any case, Nintendo has largely defined hardware in that regard, and made it standard.
In fact, it can be said that for every home console Nintendo brought forth, they standardized something. The NES redefined what a controller was; the SNES did it again with the already-mentioned cardinal button placement (ABXY) plus shoulder buttons; and the N64 added the analog stick and the rumble feature. All of these were copied by their competitors. More notable still, the now-default support for four or more controllers also first appeared on the N64.
Admittedly, the GameCube wasn't quite as standard-setting as its predecessors. It continued to innovate, being smaller and far more durable than the competition, and including shoulder buttons that were sensitive not only to being pressed in, but to how far they were pressed in. However, for perhaps the first time, Nintendo didn't introduce hardware features that the rest of the industry found it necessary to standardize.
In any case, with all the certainty that Sony and Microsoft are eagerly awaiting their chance to copy Nintendo once the Revolution is revealed, the risk of Nintendo's hardware innovations failing to standardize seems rather small.
My other major critique of the author is centered on his perception of the state of affairs in the coming generation and Nintendo's current situation. While everyone has their own projections for the victors and spoils of the next generation, there is hardly any propaganda to go on, let alone an actual set of hard data. Until more is revealed about any of the consoles aside from how they look and theoretical computational values, we honestly can't say for certain that a system will do well or poorly. We can say what we think, but it isn't certain. In that regard, the author calls the Revolution "dead on arrival". In the same way I say it'll be quite alive, but as it stands it all boils down to speculative opinion. It has to be conceded that my opinion, or any other, must accept that with as little information as we have, change is possible, if not probable.
As for Nintendo's current situation, the author says the GameCube has been "buried" and the GameBoy is "drowning". I can't agree with either of these statements. Regardless of whether the author includes the DS along with the GameBoy (despite their being separate systems, the author continually refers to the "GameBoy DS"), I see hundreds of middle school kids and their younger siblings carrying GameBoys, but I hardly ever see a PSP or DS. It's simply far too early to start making a coffin for the premier handheld emperor who has reigned for countless years. As for the GameCube, it and the Xbox did approximately as well as one another. Since Nintendo made money on every GameCube sold, and Microsoft lost money on every Xbox, if the GameCube is buried, the Xbox had a mine shaft collapse on it.
The author does make some very good points. The point concerning publishers and cross-platform games is poignant, and needs to be considered carefully. Even a fledgling video game developer like myself understands that porting a game between systems is hard enough without having to worry about additional features that different consoles may have. The ease of sticking with something familiar is alluring.
Additionally, his argument concerning the GameBoy Micro is pretty irrefutable. Perhaps it's my lack of caring about things that are marketed to people like me, but the micro seemed to be a waste of time to me. It might do well, but if anyone's buying, it'll be the young gamers and not people like me.
In closing, however, the author jumps from denouncing Nintendo's attempts to innovate hardware, which was his strongest argument, to applying that to Nintendo's games. There's a jump from the well established and plausible argument of "Nintendo's innovation in hardware will be killed by what's standard" to "Nintendo's innovation in both hardware and software will be killed by what's standard". The author cites upcoming and already released titles marked for their innovation, and somehow his argument concerning Nintendo's failing hardware is supposed to apply.
As I said, his argument concerning hardware was plausible. Despite Nintendo's previous innovations, the GameCube hardly standardized anything as I said myself. It's entirely possible that Nintendo might flop with the Revolution and succumb to simply making games for other people. I wholly disagree with the notion thrust forward at the end of the article that Nintendo will fall wholly and fully, condemning us to unoriginal titles and hardware. It just doesn't follow from his argument.
Given standardized hardware, people will still innovate within games because consumers will demand it. Gamers have proven consistantly that selling us a game that is barely different from its predecessor doesn't work. Unless something is significantly changed, the series will rot in obscurity. If Nintendo stops making hardware, that doesn't immediately mean that gamers will suddenly lose their ability to discern that GTA27 didn't change anything but a few buildings from GTA26. Were such a situation to occur, gamers would simply stop buying games much like the first Video Game Crash back before the NES.
So, I'll conceed a lot of the author's points, but I come to a very different conclusion. If there was no Nintendo, it would necessary for Sony/Microsoft to create one.
That's not to say I necessarily agree with everything said within the articles, but what is said is said very well and obviously had far more thought put into it than the rest of the internet (although that isn't necessarily saying much).
In any case, the one article I can say I agreed with the least had to be The Contrarion. The fundamental argument of the article was that standardization eventually kills mutants, and that because of that principle Nintendo is doomed (hardware-wise). This summary hardly does the article justice, so I suggest you read it yourself before reading any further.
The author uses some pertinent experiences to illustrate the major point, and his argument remains consistent and connected throughout. However, I don't think that the argument holds. One of his early points is that "ubiquity only works when married with standardization". This is the first hole in his argument.
Nintendo's hardware has largely defined what is standard, especially in terms of the controller. The analog stick, the rumble pack, and the cardinally placed button layout seen on most controllers were all Nintendo's before anyone else's. In fact, Nintendo has been so influential that aspects of their controllers have even been hastily mimicked in desperate attempts to beat Nintendo to their own innovations. In any case, Nintendo has largely defined hardware in that regard, and made it standard.
In fact, it can be said that for every home console Nintendo brought forth, they standardized something. The NES redefined what a controller was; the SNES did it again with the already-mentioned cardinal button placement (ABXY) and shoulder buttons; and the N64 added the analog stick and the rumble feature. All of these were copied by their competitors. More notable still, the now-default support for four or more controllers also first appeared on the N64.
Admittedly, the GameCube wasn't quite as standard-setting as its predecessors. It continued to innovate, being smaller and far more durable than the competition, and including shoulder buttons sensitive not only to being pressed, but to how far they were pressed. However, for perhaps the first time, Nintendo didn't introduce hardware features that its competitors went on to adopt as standard.
In any case, with Sony and Microsoft all but certain to be eagerly awaiting their chance to copy Nintendo once the Revolution is revealed, the risk of Nintendo's hardware innovations failing to standardize seems rather small.
My other major critique of the author is centered on his perception of the state of affairs in the coming generation and Nintendo's current situation. While everyone has their own projections for the victors and spoils of the next generation, there is hardly any propaganda to go on, let alone an actual set of hard data. Until more is revealed about any of the consoles aside from how they look and their theoretical computational values, we honestly can't say for certain that a system will do well or poorly. We can say what we think, but it isn't certain. In that regard, the author calls the Revolution "dead on arrival". In the same way I say it'll be quite alive, but as it stands it all boils down to speculative opinion. It has to be conceded that my opinion or any other must accept that with as little information as we have, change is possible, if not probable.
As for Nintendo's current situation, the author says the GameCube has been "buried" and the GameBoy is "drowning". I can't agree with either of these statements. Regardless of whether the author includes the DS along with the GameBoy (despite their being separate systems, the author continually refers to the "GameBoy DS"), I see hundreds of middle school kids and their younger siblings carrying GameBoys, but I hardly ever see a PSP or DS. It's simply far too early to start making a coffin for the premier handheld emperor who has reigned for countless years. As for the GameCube, it and the Xbox did approximately as well as one another. As Nintendo made money on every GameCube sold, and Microsoft lost money on every Xbox, if the GameCube is buried, the Xbox had a mine shaft collapse on it.
The author does make some very good points. The point concerning publishers and cross-platform games is well taken, and needs to be considered carefully. Even a fledgling video game developer like myself understands that porting a game between systems is hard enough without having to worry about additional features that different consoles may have. The ease of sticking with something familiar is alluring.
Additionally, his argument concerning the GameBoy Micro is pretty irrefutable. Perhaps it's my lack of caring about things that are marketed to people like me, but the Micro seemed like a waste of time to me. It might do well, but if anyone's buying, it'll be the young gamers and not people like me.
In closing, however, the author jumps from denouncing Nintendo's attempts to innovate in hardware, which was his strongest argument, to applying that to Nintendo's games. There's a leap from the well-established and plausible argument of "Nintendo's innovation in hardware will be killed by what's standard" to "Nintendo's innovation in both hardware and software will be killed by what's standard". The author cites upcoming and already-released titles marked for their innovation, and somehow his argument concerning Nintendo's failing hardware is supposed to apply to them.
As I said, his argument concerning hardware was plausible. Despite Nintendo's previous innovations, the GameCube hardly standardized anything, as I noted myself. It's entirely possible that Nintendo might flop with the Revolution and succumb to simply making games for other people's hardware. But I wholly disagree with the notion thrust forward at the end of the article that Nintendo will fall completely, condemning us to unoriginal titles and hardware. It just doesn't follow from his argument.
Given standardized hardware, people will still innovate within games because consumers will demand it. Gamers have proven consistently that selling us a game that is barely different from its predecessor doesn't work. Unless something is significantly changed, the series will rot in obscurity. If Nintendo stops making hardware, that doesn't immediately mean that gamers will suddenly lose their ability to discern that GTA27 didn't change anything but a few buildings from GTA26. Were such a situation to occur, gamers would simply stop buying games, much like the first Video Game Crash back before the NES.
So, I'll concede a lot of the author's points, but I come to a very different conclusion. If there were no Nintendo, it would be necessary for Sony/Microsoft to create one.
20050710
Fair enough
So I went to a computer fair today. In all honesty it wasn't as spectacular as I could have hoped. When my Dad and I pulled up to the Holiday Inn where it was held, he said, "The place isn't mobbed, that's a bad sign."
Overall, I wouldn't call the experience bad, but in the future I should probably make certain I'm going to a far larger and more varied fair.
In any case, the one I attended was very small. The entire fair could probably have fit in my house (even in its current state). As such the variety of goods wasn't spectacular, and neither were the options for payment. Whether a function of the lack of competition between vendors, or of some other source, the meaty stuff I had really hoped to save money on was on average fifty dollars more expensive than the best deal online. The only items you could really save money on were floppy drives, CD and DVD drives, and CRT monitors. Everything else was pretty pricey.
All I ended up getting was a DVD/CD drive. The floppy and monitor are going to be coming from my aunt's computer, which I will surely cannibalize upon receiving it. Since neither is really necessary at the moment, it's not important that I have them now (I'll be borrowing one of my Dad's monitors until then).
So, upon returning home I spent two hundred and seventy dollars at newegg.com on a motherboard, memory, and a processor (with heatsink and fan). For an encore, I won a copy of Windows XP (OEM) off of eBay for seventy dollars. With those in the bag, the only missing components will be the four hard drives I need to do RAID 0+1.
For four hard drives from newegg, it'd be another two hundred dollars (ironically, that number doesn't change for twenty, forty or eighty gigabyte sets). Because I know I can do better on eBay, I've turned my focus there.
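As a side note, the arithmetic behind RAID 0+1's appetite for drives can be sketched quickly: pairs of drives are striped for speed (RAID 0), and the whole stripe set is mirrored (RAID 1), so half the raw capacity goes to redundancy. The drive sizes below are just illustrative.

```python
def raid01_usable_gb(drive_gb: int, drives: int = 4) -> int:
    """Usable capacity of a RAID 0+1 array: mirroring halves the raw total."""
    # RAID 0+1 needs an even number of drives, at least four.
    assert drives % 2 == 0 and drives >= 4
    return drive_gb * drives // 2

print(raid01_usable_gb(80))  # four 80 GB drives -> 160 GB usable
print(raid01_usable_gb(40))  # four 40 GB drives -> 80 GB usable
```

The upside of paying that capacity tax is that the array survives a single drive failure while still getting striped read/write speed.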
In any case, the sum total I'll have spent on hardware for this thing will probably end up being five hundred dollars. That's not bad at all. For a non-proprietary system that will kick the rear of any proprietary system (eat your heart out, Dell), it's darn good.
Most importantly, it'll give me some good experience in understanding the hardware relationships in a computer, and give me something to develop on in the future. The fact that it should be able to play most recent video games decently as well is merely icing on the cake.
Anyway, I hope to someday go to a halfway decent computer fair, which I don't think the one I visited today qualifies as.
20050709
Helding Hands?
Excuse the grammar of the title; it's intentional.
In any case, anyone who knows me well knows that the closest equivalent to a handheld game console I've ever owned has been those cheapo five dollar games that use LCDs to give a player extremely limited options in terms of movement and fighting capability. I haven't ever owned a GameBoy, a GameGear, a Nomad, a GameBoy Color, a Lynx, or anything remotely important in the grand scheme of the handheld world.
Granted, some time ago I went to the dark side and emulated games like Pokemon, the various Legend of Zelda handheld games (still haven't finished the Oracles, should probably do that sometime) etc. However, I didn't actually own a handheld, and my only interest in owning one would have been in a similar fashion to that of a home console, to play at home.
It's probably because of my Dad's hard work in curtailing my efforts to play the handhelds my fellow young orienteers brought with them. My Dad saw the point of going out to nature as having something to do with nature, and handhelds didn't fit that category despite the first level of Kirby's Dreamland being a woodland area. In any case, I tend to think along similar lines. If you're going somewhere, you're going there to do something you can't do anywhere else.
So, when I say I'm getting more and more interested in a handheld, understand that it's in the opposite sense of why people usually want one. I'm really not interested in taking it places, heck, probably not even on the long, nine hour car trips I take to come home for college breaks. I'm likely to use it very much the same way my close friend John uses his GameBoy Advance SP: just something you play games on at home.
That all said, I've become increasingly interested in the latest of Nintendo's offerings, the Nintendo DS. I think what has probably kept me uninterested in the PSP (and most other handhelds, for that matter) is that it's a less powerful system than what I already own, for the sake of a feature I'm not really interested in (portability). My family and friends know that I can pack up my GameCube in a few minutes and be ready to take it far away so long as a TV is waiting for it. In fact, I can do that almost as readily with a PS2 or Xbox (though they are a little bulkier). Portability really isn't that good a selling point for a person like me.
What really interests me in the DS is the uniqueness of its games, which follows from the uniqueness of its hardware. With two screens and a touch pad, the name of the game is rather different from what you can do with a regular television or computer monitor.
Granted, the games are not wholly unique. The major titles all bear familiar names like "Metroid" and "Kirby". But unlike the PSP and even the GameBoy Advance, I can't say "I played that or something just like it on the SNES/PS1, why should I bother with the handheld?"
Unfortunately for Nintendo, this interest really amounts to a minor longing whenever I see KB Toys' Nintendo DS sitting unsold in the glass case behind the counter. As I'm already going to Japan, building a computer, and, most importantly, a college student, I can't really afford to plop down one hundred and fifty dollars for the DS, not to mention games. While GameStop sells them used for thirty dollars less, the risk of scratched screens is prohibitive (and that's not something I'm used to worrying about).
As it is, I must prioritize that which is really important to my future. The computer I'm building is necessary for helping me develop games. While it won't be the best on the market, it'll be good enough for me to start using my overimaginative mind to actually make wondrous games of happy joyousness. The DS (while arguably cheaper) is not vital to anything, not even my happiness.
So, no DS for me.
I'm going to a computer fair with my Dad tomorrow, and there'll probably be a horde of video game related junk I'll have to steel myself against buying, including DSs. I'll report back on how that went after tomorrow.
Don't look like rain...
As I left for work this morning, I told my Dad I'd both go to and return from work on the power of my own two legs, provided it didn't rain. My Dad replied that it wasn't going to rain today. Now my Dad claims his misprediction is the result of his guzzling dry-ice-filled water the night before.
I'm not buying it.
In any case, it rained for probably 95% of the time I was running home. Upon my soggy return, the rain stopped.
My Dad received a resounding "YOU!" when he returned home having failed to find me out in the rain. Apparently he left right before I got home.
Oh well.
20050707
RPT: Akira and Thought
I just finished watching Akira, which is a really excellent example of good anime in my book. Well drawn, strong plot, well-timed action, good characters, and large gaps in the scheme of things that we need to fill in ourselves.
In any case, I noticed something extremely odd. For a movie that is only one hundred and twenty-four minutes in length, it felt as though it lasted at least an hour beyond that. Don't get me wrong, I'm not talking about the "When will this movie end?" kind of additional duration. I'm not even certain I've encountered a movie that's done what this one did to me timewise. The material of the movie was spaced out in a careful manner; there's a lot of time to think about everything that is happening as it happens, and after it happens. It doesn't resemble in any way the Hollywood tendency to throw things at you by the thousand per minute. Yet, despite the seemingly liberal, time-eating spacing, Akira seemed to get through more material than one could imagine cramming into a two hour film. I started the film at half past ten, and when it was finished I was dead certain that three hours had passed, yet it was only twelve thirty.
This got me thinking.
Actually, the film itself got me thinking, because while one can probably view the film without a shred of thought there's a lot of thinking to be done if any shred of understanding of how it ends is to be gained. So, I was already thinking a lot after the movie (especially because the ending requires it) and it struck me that perhaps therein lay the answer to my puzzlement at the difference of one hour between reality and my perception.
The secret lies not so much in the spacing, but in the delivery. Akira must be giving me material at a very appreciable rate in order for it to cover as much as it does, because mystical bubbles projected from the VCR that allow the viewer to take only two hours to complete a three hour movie are still only a dream nerdy geeks like myself come up with. So if there isn't a difference there, it must be a matter of delivery. The way digestible information is given in Akira lends itself to giving the viewer more mental breathing room than the "in your face" methods we see in many films today.
Beyond that, the film does actually call upon the viewer to think. As excellent a film as The Last Samurai was, there wasn't a whole lot of thought required to watch it. Not necessarily a bad thing, but notable. Akira requires a lot of thought, although there are times when you stop to simply take in events and action.
The key to it all is really just that: Akira asked me to think, gave me a lot to think about, and gave it to me in such a way that I had some breathing room to work with. Because of that I was, not surprisingly, thinking. So, after all that typing, we still haven't come to understand why I thought three hours had passed when it was really only two. While we could simply ascribe it to a lack of chronological sense, I think the following is more interesting.
The key is stated above, but in a sense the door that the key unlocks is thought. Because I was thinking, my perception of time changed. Actually, that statement isn't really true. In one sense or another we are thinking all the time, and when we stop we're brain dead or comatose. It wasn't thinking that changed my perception, but my rate of thought.
If you think about it, it's very much akin to film in the old days. A man literally cranked the movie camera in order to film a scene. That man had the power to crank faster or slower, and thus speed up or slow down the scene. The faster he cranked, the more frames per second, and the slower the playback would be. In a similar way, when our rate of thought increases we perceive a greater number of the infinite moments that constitute time, and in doing so things slow down.
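The crank arithmetic can be made concrete with a tiny sketch: capture more frames per real second, play them back at a fixed projector rate, and the event stretches. The 16 fps projector rate below is just an illustrative silent-era figure, not a claim about any particular film.

```python
def playback_seconds(real_seconds: float, crank_fps: float,
                     projector_fps: float = 16.0) -> float:
    """How long a filmed event runs when projected at a fixed rate."""
    frames = real_seconds * crank_fps  # frames captured of the real event
    return frames / projector_fps      # time needed to show them all back

# Cranking twice as fast as the projector rate doubles the on-screen time:
print(playback_seconds(10, 32))  # a 10-second event plays back for 20.0 s
print(playback_seconds(10, 16))  # cranked at projector speed: 10.0 s
```

The same ratio is the analogy for thought: if the "capture rate" of moments goes up while the world plays on at its usual speed, the experience feels stretched.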
Now, it is important to note that rate of thought is a complicated thing. While we may not think exactly like computers, rate of thought is somewhat analogous to a CPU's processing speed. The higher the speed, the more instructions a CPU can handle every second. Rate of thought is the same. The important point here is that rate of thought does not mean we're necessarily thinking straight or orderly. I could be thinking very quickly in the rate-of-thought sense, but if each thought segment is jumbled and disordered, the amount of time I'll take to complete my thought or get anywhere useful with it can still be quite long. In the same way, someone thinking quite orderly and efficiently but with a slower rate of thought isn't hampered by thinking "slowly".
It seems to me that the brain works very hard at maintaining a pretty constant rate of thought. Even when we get tired, we still perceive time at least close to how we do well rested. However, our thoughts become less orderly and our senses less reliable. Yet the brain still maintains rate of thought, and thus perception, through all that mess. However, sometimes when we get sufficiently focused (or unfocused, in which case the opposite happens) on something, rate of thought can increase greatly. Sometimes we look back and wonder how we managed to get quite so much mental gymnastics done in the last hour. More dramatically, I still remember my car accident, when I got to read my mom's van's license plate over and over again in slow motion as it spun in the air in front of me. While the incredible change in rate of thought there was rather wasted on arguing with myself over whether I really had just gotten into the car accident I was witnessing, the point remains that humans seem capable of an increased rate of thought and thus a perception of more moments of time.
So, if we're capable of this, why doesn't humanity simply run in overclocked mode all the time? Once again, I've thought about it and came to some conclusions. For computers, overclocking yields faster processing speed. However, it runs down (and even potentially damages) the hardware. This doesn't mean that the computer doesn't run, but it means the lifetime of the CPU and motherboard will be shortened. Similarly, were we to overclock our brains all the time, at the very least we'd grow tired far more quickly, if not wear ourselves out of brainpower before retirement.
This isn't even to mention the fact that at the super-heightened brain speeds achieved at times like my car accident, communication between people breaks down somewhat. Because sound and the rest of the universe aren't speeding up with you, you don't get input any faster. Although thinking quickly, a person can't really talk any more rapidly (despite breakthroughs in radio disclaimer technology). That's not to mention the large amount of perception we already waste waiting for rendering to finish, coffee to brew, compilers to compile, etc. If a watched pot never boils, imagine watching it in slow motion, not boiling.
While there is obvious survival benefit in being able to have plenty of time to see and think about how one might not get mauled by the fierce animal that is attacking, if doing so makes you sleep more (and be vulnerable), makes communication more difficult (it's hard enough to have clear communication between genders as it is), and most of the time isn't terribly useful, it doesn't make sense to do it all the time. This isn't even bringing into consideration the possible chemistry of the situation.
So, while we are capable of super quick thinking, it seems we're set up, at least biologically, to save it for times of need.
Rambling is finished. I sleep now.
In any case, I noticed something extremely odd. For a movie that is only one hundred and twenty four minutes in length, it felt as thought it lasted at least an hour beyond that. Don't get me wrong, I'm not talking about the "When will this movie end?" kind of additional duration. I'm not even certain I've encountered a movie that's done what this one did to me before timewise. The material of the movie was spaced out in a careful manner, there's a lot of time to think about everything that is happening as it happens, and after it happens. It doesn't resemble in any way the Hollywood tendency to throw things at you in thousands a minute. Yet, despite the seemingly liberal, time eating spacing, Akira seemed to get through more material than one could imagine crammed into a two hour film. I started the film at half past ten, and when it was finished I was dead certain that three hours had passed, yet it was only twelve thirty.
This got me thinking.
Actually, the film itself got me thinking, because while one can probably view the film without a shred of thought there's a lot of thinking to be done if any shred of understanding of how it ends is to be gained. So, I was already thinking a lot after the movie (especially because the ending requires it) and it struck me that perhaps therein lay the answer to my puzzlement at the difference of one hour between reality and my perception.
The secret lies not so much in the spacing, but in delivery. Akira must be giving me material at a very appreciable rate in order for it to cover as much as it does, because mystical bubbles projected from the VCR that allow the viewer to only take two hours to complete a three hour movie are still only a dream nerdy geeks like myself come up with. So if there isn't a different there, it must be a matter of delivery. The way in which digestible information in Akira is given lends itself to giving the view more mental breathing room than the "in your face" methods we see in many films today.
Beyond that, the film does actually call upon the viewer to think. As excellent a film as The Last Samurai was, there wasn't a whole lot of thought required to watch it. Not necessarily a bad thing, but notable. Akira requires a lot of thought, although there are times when you stop to simply take in events and action.
The key to it all is really just that, Akira both asked me to think, gave me a lot to think about, and gave it in such a way that I had some breathing room to work with. Because of that I was, not surprisingly, thinking. So, after all that typing, we still haven't come to understand why I thought three hours had past when it was really only two. While we can simply ascribe it to a lack of chronological sense, I think the following is more interesting.
The key is stated above, but in a sense the door that the key unlocks is thought. Because I was thinking, my perception of time changed. Actually, that statement isn't really true. In one sense or another we are thinking all the time, and when we stop we're brain dead or comatose. It wasn't thinking that changed my perception, but my rate of thought.
If you think about it, it's very much akin to film in the old days. A man literally cranked the movie camera in order to film a scene. That man had the power to crank faster or slower, and thus speed up or slow down the scene. The faster he cranked, the more frames per second, and the slower the playback would be. In a similar way, when our rate of thought increases we perceive a greater number of the infinite moments that constitute time, and in doing so things slow down.
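For the numerically inclined, the hand-crank trick works out like this (the frame rates here are made-up illustrative numbers, not anything from a real camera spec):

```python
# Hand-cranked camera math: cranking faster captures more frames per
# real second, so at a fixed projection rate the scene runs longer
# on screen, i.e. in slow motion.

def playback_seconds(real_seconds, crank_fps, projector_fps):
    """How long a filmed event lasts when projected."""
    frames_captured = real_seconds * crank_fps
    return frames_captured / projector_fps

# Crank at 24 frames/sec while the projector runs at 16 frames/sec:
# a 10-second event stretches to 15 seconds on screen.
print(playback_seconds(10, 24, 16))  # -> 15.0
```

The analogy to perception is the same ratio: if the mind "captures" more moments per second than usual, the same stretch of clock time feels longer.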
Now, it is important to note that rate of thought is a complicated thing. While we may not think exactly like computers, rate of thought is somewhat analogous to a CPU's processing speed. The higher the speed, the more instructions a CPU can handle every second. Rate of thought is the same. The importance here is that rate of thought does not mean we're necessarily thinking either straight or in an orderly fashion. I could be thinking very quickly in the sense of rate of thought, but if each thought segment is jumbled and disordered, the amount of time I'll take to complete my thought or get anywhere useful with it can still be quite long. In the same way, someone thinking in an orderly and efficient fashion but with a slower rate of thought isn't really hampered by thinking "slowly".
It seems to me that the brain works very hard at maintaining a pretty constant rate of thought. Even when we get tired, we still perceive time at least close to how we do when well rested. However, our thoughts become less orderly and our senses less reliable. Yet the brain still maintains rate of thought, and thus perception, through all that mess. However, sometimes when we get sufficiently focused (or unfocused, in which case the opposite happens) on something, rate of thought can increase greatly. Sometimes we look back and wonder how we managed to get quite so much mental gymnastics done in the last hour. More dramatically, I still remember when I had my car accident, and got to read my mom's van's license plate over and over again in slow motion as it spun in the air in front of me. While the incredible change in rate of thought there was rather wasted on arguing with myself over whether I really had just gotten into the car accident I was witnessing, the point remains that it seems humans are capable of an increased rate of thought, and thus a perception of more moments of time.
So, if we're capable of this, why doesn't humanity simply run in overclocked mode all the time? Once again, I've thought about it and come to some conclusions. For computers, overclocking yields faster processing speed. However, it runs down (and even potentially damages) the hardware. This doesn't mean that the computer doesn't run, but it means the lifetime of the CPU and motherboard will be shortened. Similarly, were we to overclock our brains all the time, at the very least we'd grow tired far more quickly, if not wear ourselves out of brainpower before retirement.
This isn't even to mention the fact that at the super heightened brain speeds achieved at times like my car accident, communication between people breaks down somewhat. Because sound and the rest of the universe aren't speeding up with you, you don't get input any faster. Although thinking quickly, a person can't really talk any more rapidly (despite breakthroughs in radio disclaimer technology). That's not to mention the large amount of perception we already waste waiting for rendering to finish, coffee to brew, compilers to compile, etc. If a watched pot never boils, imagine watching it in slow motion, not boiling.
While there is obvious survival benefit in being able to have plenty of time to see and think about how one might not get mauled by the fierce animal that is attacking, if doing so makes you sleep more (and be vulnerable), makes communication more difficult (it's hard enough to have clear communication between genders as it is), and most of the time isn't terribly useful, it doesn't make sense to do it all the time. This isn't even bringing into consideration the possible chemistry of the situation.
So, while we are capable of super quick thinking, it seems we're set up, at least biologically, to save it for times of need.
Rambling is finished. I sleep now.
20050705
Franklin: Origins of the Sammich
Hai guys! Franklin here with another sordid tale of my irregular lifestyle.
Today I will speak to you about an experience that led to the introduction of a lunch special called the "Sammich" at the local diner near where I reside. The story can be kind of confusing at times, but I've given it a lot of thought as I work the late night janitorial shift at Joe's House of Pancakes.
It all started when the Technothon 3000.1 came to town. Apparently they mistook my fair and quiet city for some residence of neo-hippies, because as soon as they entered the town the silence was destroyed by something halfway between a cell-phone ring tone and a nuclear holocaust. Judging from the dazed and groggy expressions of the people strewn about the town green, I surmised they were as stunned as I was.
Steeling myself, I attempted to talk to the man in charge about this intrusion into my town's peaceful summer. However, the conversation ran into great difficulty, as I was wearing army issue ear plugs and the director was deaf. After excessive amounts of incomprehensible hand gesturing to each other, in which I deduced that this man had been born in Japan, raised by rabid sparrows, and then educated at a school where everyone spoke backwards, and he in turn decided that my parents had left me encased in concrete for six months, spoon fed me jello for years, and accidentally sent me to college instead of an institution, we decided to get some lunch.
The local diner had been made to be soundproof. This had happened because of the erroneous beliefs of the original owner that if he made his diner super sound proof it would be indestructible by nuclear bombs. As such, it made the perfect place to get away from the terrible din.
Upon entering, it turned out that the director belonged to the long family lineage of those now running the diner. The Japanese couple running the diner were overjoyed, especially because the director's tattoo contained the last instructions needed to finally complete the secret family recipe for the ultimate sandwich, the first round of which cured all the deafness induced by the Technothon 3000.1 going on.
Unfortunately for our town, the diner left with the Technothon, and now makes a fortune curing the deafness the Technothon induces. So the heavenly "Sammich" left us.
I can't really remember where the name "Sammich" came from, as I didn't take my earplugs out until well after the conversation had ended.
In any case, be sure to have some earplugs around the house in case the Technothon comes to your town. I hear it's up to version 3017.7 now, which could be getting dangerous.
20050703
Burn the Forests!
The stupid trees in the neighbor's yard behind ours grew over the past years. They perfectly obscure the fireworks, even from the roof.
In any case, I'm going to make my Mom feel bad for leaving me behind. I had responded, "Yes, I want to come" every time she asked me over the past week. While the blame for being left behind is not hers entirely, as I didn't properly check to make sure I didn't miss the boat, I think I can justify making her feel bad.
When she left with my brother, she said to me exactly this from down the hall, "Your father needs a tick check." She didn't say that she couldn't do it herself because she was going to the fireworks, she just told me I needed to do it. Seeing as how she'd complained about being tired earlier, I could understand not wanting to try and peer across my father's freckled back to find ticks.
This bypassed my checking mechanism, which worked under the assumption that I would receive word that people were going to the fireworks. It didn't seem that bad an assumption to me. However, I forgot the tendency for people to not inform me of things.
In any case, I'm probably going to try to see some fireworks tomorrow evening after work. Probably with my dad, since he didn't go to tonight's either. And because he helped me get off the roof (that rickety step ladder is really scary).
Happy Fourth!
Merry Christmas
KB Toys is having a special sale called "Santa's Summer Sale". Regardless of how the sale works, it meant one thing and one thing only for me.
I got to dress up as Santa.
Admittedly, Santa doesn't usually wear a pair of cheap shades and a Hawaiian shirt, or say "Merry Christmas" in July. However, this was overlooked by just about everyone, including the children.
The job was actually very enjoyable, aside from the wig (try wearing one sometime; not only are they uncomfortable, but they are incredibly good at amplifying heat). I got to hand out small candy canes to lots of children, I got a lot of smiles from the passersby, and I got to joke around with customers about the whole crazy situation of Santa in the summertime.
A few highlights:
A high school student pronounced me the "illest" Santa he'd ever seen, and proceeded to do one of those "ill" high five hand shakes with me.
Mom came and took my picture. She didn't use the digital camera, so I'll have to scan it in some other time.
Some high school girls had their picture taken with me as well.
The father of a child came to me to relay a message from his young son. We had a brief conversation about how his son was afraid of Santa, and was trying to get over it. He pointed across the way to the food court where, behind the railing, the child was hiding with his mother. We waved and they waved back. I gave his father a candy cane which the man used to entice his son close to me, but we never ended up actually touching.
Those were some highlights. I think I scared away at least five children, and gave away at least three hundred candy canes. I said "Merry Christmas" enough times that I can probably go without saying it again for the rest of my life (manager Jeff asked me if I could say something else, and I asked him what else Santa says).
In any case, I find it likely I'll be doing it again tomorrow, as I work from two to seven.
I'm now going to go onto the roof of the house to watch fireworks. Hope you all have a good Fourth!