So I obviously was going to watch Battlestar Galactica: Razor when it aired on Saturday...and I did.
Overall, I enjoyed the two hours. It was nice to see something new in Galacticaland, since we have such a looooong gap between the end of season 3 and the start of season 4.
My major complaint is that it was too damn short. I think the producers could have gotten at least an entire season's worth of material out of the story of the Pegasus (which is kinda what I was hoping for... especially since the Pegasus didn't show up until halfway through the second season of Galactica... what the heck was going on during all that time?), but instead they crammed it into two hours.
The other (relatively minor) problem I had was with the way the story unfolded: it's basically a flashback (Lee Adama taking command of Pegasus) within a flashback (Kendra Shaw's time on Pegasus under the command of Admiral Cain, during and after the initial Cylon attack), with yet another flashback mixed in (Admiral Adama's encounter with the Cylons' experiments on humans). While Admiral Adama's flashback is relatively brief, this structure makes for some disorienting jumps between timelines - coming back from commercials usually required a moment to work out which timeframe the story was in.
But neither of these problems was particularly disastrous... and both were entirely redeemed by one single scene that made the whole thing worthwhile....
First, though, a quick aside: with the references to the first Cylon war, we actually got to see the "generation 1" Cylon Raiders (i.e. the model from the original series) in full digital-effects glory, which I got a kick out of (you can also see the original basestars in the Razor Flashbacks that SciFi aired). We've always gotten to see the original Viper models (since they're on Galactica), so it seemed fitting to finally see the other well-known ship from way back.
But the absolute best scene in the entire story was the scene with the Cylons in the generation 1 Raider cockpit - shortly after the Cylons destroyed the Raptor containing the raiding party - and the conversation that unfolds there.
When that scene unfolded (and then immediately cut to commercial)...I was struck silent for a moment.
And then I burst out laughing.
Not a "ha ha, funny" kind of laugh... but an "Oh WOW... did they just do that?" kind of laugh.
Without spoiling too much (I hope I haven't already, for those who haven't yet watched it), the scene is a direct reference to the original show.
You're probably saying, "So what?" But keep in mind that up until this point, the new Battlestar Galactica had made no real overt attempts to reference the original series. The only references within the show itself had been the original Vipers, and a brief playing of the original series' theme during Galactica's decommissioning ceremony (in the original miniseries) - and both were melded into the show in a way that made them almost hidden.
This particular scene cannot be mistaken for anything other than an acknowledgment of the original show.
Which makes the scene just that much more worthwhile and memorable.
I just loved it.
So, anyway: On the whole, Battlestar Galactica: Razor was great to watch.
And it serves as a nice teaser for the upcoming season 4, which is just too damn far away from premiering...ugh.
Today most of that tends to fall into the "Web 2.0" genre and the various technologies associated with it: fancy terms like Rich Internet Applications, and technologies such as AJAX, Ruby on Rails, Flash/Flex, Silverlight, JavaFX, and so on...
But a glaring flaw in our mad scramble to implement the latest in "gee-whiz" Web-based applications became apparent to me after a visit to my parents' house last week.
You see, my dad owns a low-end laptop: a Compaq he received for Christmas in '06. It works well for his modest needs, and really there's nothing wrong with it from an overall performance standpoint.
The problem only comes when he hits the internet. Since my dad has fairly basic needs, he only uses dial-up, and from my point of view that's not a bad situation for him - there's just no need for a super-hyper-fast connection in his case (and I'm hard-pressed to justify the extra expense of broadband given his limited needs). Unfortunately, because of this, a good deal of useful stuff out there is nearly out of his reach thanks to the push for more of a Web 2.0 experience. The constant drive to make web sites behave like desktop applications is leaving users like my father behind, and ill-served.
Just as an example (not a "productive" one, mind you, but representative of the problem), trying to view a 30 second clip on a site like YouTube takes an agonizingly long time for him. Heck, any website using Flash (or similar) technology can be nearly impossible for him to use. And browsing his email via the AJAX-enabled web mail client his provider supplies can be a chore, as the interface causes some annoyingly slow response times.
Now before you go thinking he's in the minority, keep in mind that over half of the internet users in the U.S. alone are still on dial-up. Current estimates* put the total number of internet users in the U.S. at over 200 million. So... that's well over one hundred million people on dial-up.
That's a lot of users.
Whether we like it or not, there is still a huge segment of the (U.S.) internet population with slow connections. Those users might be better served if we keep things simple and rely on much less fancy (and perhaps more Web 1.0-ish) ways to present our web sites.
A hundred million people might be happier with us because of it.
* I should note that this link references a year-old FCC stat on broadband penetration, which is both way out of date and, as everyone knows by now, seriously flawed. Current estimates on broadband usage put it near 50%, but I haven't been able to find any reliable reference.
So I finally decided to give Flex a look recently, and what I see seems very promising. I haven't dug too far into the framework yet, but so far I'm liking what I see.
First, the MXML interface language makes interface construction disgustingly easy. I've spent years working with Java Swing (since the 1.2 days), which is a pain to work with at best. Being able to slap together an interface using an XML-like syntax, and to tie event handlers to controls in a simple manner, is a godsend. (Note: yes, there are XML-based frameworks for building Swing interfaces, but I've never found one that worked as well as I would have liked... and certainly none as easy as MXML.)
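To give a flavor of what I mean, here's a tiny MXML sketch of that kind of wiring - written from memory of the Flex 2-era syntax, with made-up names (`sayHello`, `greeting`), so treat the details as approximate:

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- A label and a button, with the click handler wired up declaratively -->
<mx:Application xmlns:mx="http://www.adobe.com/2006/mxml">
    <mx:Script>
        <![CDATA[
            private function sayHello():void {
                greeting.text = "Hello from Flex";
            }
        ]]>
    </mx:Script>
    <mx:Label id="greeting" text="(waiting)"/>
    <mx:Button label="Say hello" click="sayHello()"/>
</mx:Application>
```

The equivalent in Swing means instantiating the components, a layout manager, and an ActionListener all by hand.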
And I won't even go into a comparison with developing interfaces for the web...ugh...
This is just my initial impression so far: I haven't created anything particularly useful yet with Flex, but I have some ideas that I'm working on to help me really get into the guts of the framework and see how things work.
My only real complaint so far is not with Flex itself but with the Flex Builder IDE (an Eclipse-based system) - it's too damn expensive! This is something that's always kept me away from Flash: the high cost of entry is a big turnoff. Thankfully, the Flex SDK itself is free, which means you can slap together a build system and use your own editor if you want. But the Flex Builder IDE makes things much easier (as it should).
If Adobe really wants to push for adoption of Flex for web development, they should consider making Flex Builder more affordable for the average developer (if not free). [update: see end of this post]
Aside from that, I'm liking what I see so far. I can tell there are a few rough edges here and there, but I look forward to exploring Flex some more in the near future.
update: Based on the comment below (woohoo! someone actually read my post!) - and confirmed on Adobe's site - it looks like Adobe has cut the price on Flex Builder by half. That at least puts it in reach of the average developer. Of course, I still prefer free, but I'm a bit spoiled by all the open-source Java development stuff I've been able to work with over the past few years. Nevertheless, it's a step in the right direction if Adobe really wants to push for adoption of Flex.
In my previous post, I described how I went through considerable pains to automate the entire Development Environment setup. However, I completely missed the simple fact that I should not have been doing what I was to begin with.
From Jeff (emphasis in original post):
If your "build process" is the F5 key, you have a problem. If you think this sounds ridiculous-- who would possibly use their IDE as a substitute for a proper build process? -- then I humbly suggest that you haven't worked much in the mainstream corporate development world. The very idea of a build script outside the IDE is alien to most of these teams.
Get your build process out of the IDE and into a build script. That's the first step on the road to build enlightenment.
The correct approach for the project I was working on should have been to ensure that the project built outside of the IDE. This is a classic case of solving the wrong problem. Live and learn.
Not that this excuses what I did, but I should note that this particular project was peculiar in many ways, so getting the IDE setup automated wasn't necessarily a bad thing. Still, had I thought about the situation a little more, I would have realized that I was better off establishing a build script for the project instead.
Even more annoying is the fact that prior to working on this particular project, I had another project with a nicely automated build script (several, in fact... all created by me) that could build the entire application with a few short commands (and with only a few key utilities needing to be installed ahead of time). I had the ideal answer staring me right in the face and didn't see it.
Anyway, this is a classic case of "don't try this at home, kids." You're better off doing as Jeff says: make the project build outside of any IDE. That is the best way to avoid headaches.
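As a sketch of what "outside of any IDE" can look like at its most minimal, here's a hypothetical little Python driver for a Java build - the `src/` and `build/classes` layout and the function names are made up for illustration, and in practice you'd reach for a real tool like Ant or Make instead:

```python
# build.py -- a minimal IDE-free build sketch (hypothetical src/ and build/ layout)
import pathlib
import subprocess


def javac_command(src_dir, out_dir):
    """Assemble one javac invocation covering every .java file under src_dir."""
    sources = sorted(str(p) for p in pathlib.Path(src_dir).rglob("*.java"))
    return ["javac", "-d", out_dir] + sources


def build(src_dir="src", out_dir="build/classes"):
    # One short command builds the whole tree -- no IDE involved,
    # and check=True makes a broken build fail loudly.
    subprocess.run(javac_command(src_dir, out_dir), check=True)
```

The point isn't the particular tool - it's that anyone (or any automated process) can produce a build from the command line without ever opening the IDE.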
I've been doing Java development since the 1.2 days. And while I've certainly enjoyed using it (compared to something like C or C++), as I grew more adept with the language I couldn't help but feel that something was not quite right with the language itself.
I couldn't quite put my finger on it until I started to explore other languages. My first stop was Lisp - mainly due to reading what ESR and Paul Graham had to say about it - and I began to see some of the weaknesses of Java by comparison. Then again, Lisp has an intrinsically geeky nature, and it's so different from everything else that comparing other languages to it favorably is nearly impossible... so maybe it wasn't a fair comparison to make. I didn't think much more about it at the time, but I did begin to see that other languages had some serious strengths compared to Java.
So then I began messing around with Python. I decided to do a small comparison between Java and Python by working out a small utility application for a project I currently work on - basically a simple GUI application. That's when it hit me: Java is a wordy language. Java is a really wordy language compared to Python: I was able to put together a GUI app in Python with about a third of the code required for Java (using Swing, of course... and using it right, I should add...).
Of course, a big part of this is the fact that Python is dynamic. I've seen that dynamic languages have a big leg up on Java in expressive power. But it still surprised me how much simpler it was to slap together a GUI in Python compared to Java.
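A toy example of the flavor of difference I mean (this is a made-up stand-in class, not a real toolkit or the actual utility app): in a dynamic language, wiring an event handler is just passing a function, where Swing requires declaring an ActionListener and an anonymous class implementing it.

```python
# A toy stand-in for GUI event wiring -- just the shape of it, not a real toolkit.
# The Swing equivalent needs an ActionListener interface plus an anonymous class;
# here the handler is simply a function handed to the constructor.

class Button:
    def __init__(self, label, on_click=None):
        self.label = label
        self.on_click = on_click

    def click(self):
        """Simulate a user click by invoking the handler, if one is set."""
        if self.on_click is not None:
            self.on_click()


log = []
button = Button("Save", on_click=lambda: log.append("saved"))
button.click()
# log is now ["saved"]
```

Multiply that kind of saving across every widget, layout, and listener in an app and the one-third figure stops looking surprising.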
Now this might not have been a fair test, but reading Stevey's post reinforced the argument for me that Java programs tend to have a certain amount of built-in, accidental complexity because of the nature of the language itself. This complexity tends to get exacerbated as an application grows. Add to the mix developers who aren't well versed (or perhaps not very competent) with Java, and you have the makings of pure hell...
But I don't want to leave this sounding like Java sucks - while it's not perfect, Java does still work very well as a solution to a large set of problems.
I just see now that it's not the only solution that can work for a given problem. No programming language is a panacea.
I wish more developers out there would understand that.
Run into any good software developer, and they'll tell you the same thing: writing code in and of itself is not difficult. Writing good (concise, relevant, maintainable) code is another matter entirely - it requires a deeper understanding of algorithms, patterns, performance characteristics, and so on.
Just being able to bang out code that works is not enough. Sure, you can get by, but sooner or later it'll catch up to you. The best developers need to have a deeper understanding of Computer Science (and related) topics.
This is one (of many) arguments made in the discussion on getting a degree in CS, but it's somewhat difficult to explain without sounding overly preachy. I'll dig more into the degree debate sometime soon.