Thursday, July 30, 2009

Morning

It's a sunny Thursday morning, and once again I haven't caught a wink of sleep. Will I be able to look sick enough to get out of work early today as well? I don't know. At least I finally managed to be productive in this particular insomniac binge.

I'm in a very complex love-hate relationship with my laptop. I like it for being retro in a strangely stylish way. I like its awesome keyboard, which is genuinely a step above practically all the competition out there. I like how the thing has been running on battery power since one in the morning and now it's... 7:30 in the morning. Pretty darn durable for a laptop with a dedicated graphics card.
But then there are some serious issues with this machine. Like the irrational behavior of some of the Lenovo-patched drivers during the sleep-wake cycle. Or how my wallpaper disappears whenever I use the battery stretch mode. Most of all, I hate how flaky the ATI driver for the dedicated graphics card on this machine is. It gave me two BSoDs last night due to an amdkmp driver crash (one Lenovo's been 'working on' since last year at least) and another in the form of a graphics-driver-related cascading failure showing the dreaded NMI/memory parity error. The exact same BSoD message I received before my last Dell's motherboard fried to a crisp due to faulty die casting of the GPU. I never get any errors when I'm using the Intel integrated graphics mode, which probably uses the 4500HD chipset, but why should I settle for the crappy integrated chip when I paid good money for a dedicated graphics solution? If I wanted a laptop that would just run off the Intel IGP I would have bought a much cheaper, lighter laptop... Though to be fair, a cheap, light laptop with a 1440x900 screen is a rarity these days for some reason. Manufacturers, Apple included, are still sticking with the crummy 1280x800 resolution for their ~13in screens. Way behind the times, those people.

Ok, I'll be honest. This laptop's still pretty good running on the Intel IGP. The things I do for work don't usually need a dedicated GPU with its own RAM. They need processing power, and this 2.5GHz Core 2 Duo machine packs enough wallop to blast most consumer-class desktops out of the water. I'm just pissed that I can't play any games on this machine without risking the whole OS going down in blue flames... To be fair I haven't been playing much of anything these days, and I certainly haven't been playing anything that would actually need the punch offered by a dedicated graphics card, but still, I'd like to keep my options open. In fact, only three things stopped me from purchasing a new aluminum MacBook instead of a ThinkPad: screen resolution, the lack of an SD card slot, and a dedicated graphics solution. Well, since the MacBooks coming out right now have a much better GPU, with SD card slots to boot, not to mention phenomenal battery life estimated at around 6~7 working hours, the only thing ThinkPads have going for them is the screen resolution, and that can be managed if you're an external-monitor kind of person.

With the unstable graphics card giving me grief, I keep thinking about bringing another gadget into my life. Maybe a new netbook (the ones on the market today last for up to 10.5 hours per charge). The 701's getting really old and it's a real pain to type up a full report on that keyboard. I can manage, but it makes my fingers feel like I've been playing the piano for hours. While a new netbook would certainly be nice (especially since even the worst netbook out there can run StarCraft, thus satiating some of my entertainment needs), I'm not sure this is a good time to buy a new system. The Nvidia Ion is just around the corner, and there is the disturbing rumor of the Apple tablet coming out as early as September, or possibly this winter.

Oh yes, the Apple tablet. People have been dreaming of it for a couple of decades now, ever since the Newton died. If Apple pulls it off there's a very good chance that I'll end up with one of those things, especially considering the wealth of science applications on the iTunes Store at the moment. Some of the applications, like Papers, are a godsend to anyone in an academic profession. And I know for certain that Drew Endy et al. are planning an iPhone OS-based mobile version of the BioBuilder platform, a beginner-friendly yet heavy-duty synthetic biology CAD program that integrates with the regular computer-based distributions... Yeah, even speaking without gadget lust, there's a good chance I'll get a touch or a tablet in the near future, since my profession almost seems to require one these days. Kind of understandable when you think about it. The last time the academic profession saw a mobile platform reliable and consistent enough for field/lab deployment was close to ten years ago, when the term PDA was new and Palm ruled the Earth.

On another note (what are rant posts without multiple topics to dazzle the readers' minds?), only 95 days until NaNoWriMo. I'm definitely participating this year, with my trusty laptop and all. I even have most of the rough draft and settings laid out in a clean, text-based wiki format. I didn't expect myself to be able to come up with such awesome ideas, but I think I might have hit the real jackpot. I haven't read anything even remotely close to it for years. Very hundred-years-of-solitude-y. With some undeniable influence from all the Japanese light novels I've been force-fed over the years.

I'm really looking forward to it.

Well, time to get to work!


Wednesday, July 29, 2009

Rainy day

Thanks to the wooziness induced by last night's late night, I was able to get off work way early today. If this keeps working with my schedule I might as well stay up late every day. It's good to be outside and free when the sun is shining, except that it's not quite the case right now.

After days of half-formed rainstorms that only lasted an hour or two, condemning the whole city to the kind of pre-rainstorm humidity and heat that would make Tokyo proud, it's finally pouring down. I don't know whether to feel happy or sad about this. Certainly I've been waiting for a decent rainstorm for a while now, with thunder and lightning. But why does it have to be on the day I could have taken my laptop out to the park to get some personal workspace? The world works in really strange ways.

With the rain, and with my brain still a little soggy from lack of sleep and rest, I just came back home right away instead of hanging around the city to do whatever. I could have spent some much needed (and decidedly cooler) time in bookstores in the area, but I didn't feel up to it. Maybe it's the weather.

So now I'm sitting on the sofa in my room, looking out a window riddled with raindrops, wondering what to do with this unexpected free time. I've already read most of the books in my personal library a few times. There might be movies on my hard drives that I could be watching, but I don't like being so passive when I'm feeling tired and under the weather. Yes, I'd rather act opposite to my mood and condition. Otherwise there's no end to the depths I might fall to.

Maybe I can try playing some games? I've already burned through my collection of Deus Ex mods a couple of times before, so that's rather out of the question. I don't feel like exploring synthetic biology right now either; I'm looking for something involved, but I don't want to wrack my brain over that kind of stuff, just not right now. Maybe I can look into some MMORPG options? Like one of those free-to-play games that are all the rage these days.

Online games are one of those interesting things in life that have so much potential to be awesome, but never are. It's like looking at a seed that keeps dying instead of blooming into the amazing flower we were all promised. Take a look at the .hack// franchise on the PlayStation consoles, for example (actually, now that I think about it, the games only came out for the PS2, with the final one promised for the PSP). Now THAT's how MMORPGs should be. Except that the .hack// games aren't MMORPGs. They're what you'd call simulated MMORPGs, with simulations of real people populating a virtual server that exists within the game. The game even has a virtual operating system with a virtual web browser and email client, with unreal people sending you email during your virtual off-time. The premise sounds weird, but it works well in practice, and the franchise has continued for close to a decade, with one awesome anime series acting as a prequel to the game (the original game spanning 4 DVDs, its sequel 3), and some not-so-awesome other things populating the marketplace (actually, one of the light novels based on the franchise is quite good: AI buster 1 and 2; I personally prefer the second one). Maybe the whole faux-MMORPG setup only works precisely because none of it is real. They are all made-up, make-believe people living in a make-believe world (oh wait, did I just describe the heart of 'real' MMORPGs as well?).

As Sartre said, hell is other people. This game can probably explain the Japanese fixation with androids better than any number of academic theses out there.

Well, I think I'll stop writing for a moment and seek out some interesting MMORPG to waste time on.

Late night or early morning?

This isn't good. It's four in the morning and I still can't get to sleep. I'm currently running some good lounge music in the background through the evil alchemy of the internet radio station, courtesy of SmoothJazz. Thinking about things like DIYbio, synthetic biology, artscience, and the upcoming NaNoWriMo, which I plan on participating in this year. Only 96 days left to go. I'm thinking of telling a bunch of my friends to register just to see if they can actually do it. Yes, even the ones who can't write artsy-creative stuff to save their lives.

The filter for the AC might have gone bad. My throat feels sore, but if I open the window the room will get hot again, meaning more sleepless agony for me. Listening to the music and looking out the window at all the blinking buildings of the main city down Broadway, the whole scene reminds me of the setup for many classic Japanese sci-fi futurescapes, the kind that can be seen in games like Snatcher (if you haven't played it yet, don't call yourself a gamer) and Policenauts, its spiritual successor, both favorites of mine.

I picked this picture of an apartment in Seattle from the blog of one of my favorite otaku on the web. How long will I have to live to be able to have a view like that outside my own apartment window? Hopefully this science gig will work out better than it is now... I've never been much of a house person. I prefer a cool apartment up high, surrounded by the pretty lights of the city, over any house, any day of the week. There's some quality to that kind of architecture that's really appealing to me... The prices of apartments in the city are dropping across the board. Maybe I should shop around for the time when I finally get my degree and become a more or less productive member of society.

It's always interesting that the impression I get from such semi-futuristic landscapes tends to be nostalgia of some sort.

I'm nostalgic about the future.

Tuesday, July 28, 2009

More workspace pics. And other things.

As usual my blog is running a late-night double feature, like the old theaters used to do. Or will it be a triple feature?

Here's a link to another collection of workspaces, this time workspaces of science fiction writers. All of them are quite well known. Some of them are even known to me. Although I'd have loved to see the actual workspaces of William Gibson/Neal Stephenson/Warren Ellis, computers and all. For reasons explained elsewhere I really dig that kind of stuff. I think Neal Stephenson is the person who taught me to take the Apple platform seriously way before it was cool to be Apple (writing this takes me back; there was a time when the Mac OSes were horrible systems and Windows-based platforms were the operating systems of the future. People would always get into a fuss about how the public school system was failing the children by letting them use Apple products while the rest of the world ran on Windows. They were so naive back then).
As I guessed, writers certainly live in a whole lot of clutter. Most of them are surprisingly clean though, even accounting for the fact that most of them probably tidied up a little before the scheduled photoshoot... It's the same with research labs, actually. Kid, I speak from experience, so listen up. While everyone out there will tell you that a well-organized workspace is essential for productivity, you should see the workspaces and rooms of the most brilliant people in the arts and sciences. Trust me, none of them are capable of maintaining a clean room on their own. There's always some kind of mess, some kind of clutter. Ever looked at Albert Einstein's desk? The thing is like a maze. I, too, have some clutter issues when I'm running large private projects that span months at a time. It's just that I try to clean everything up and keep it clean when I don't have anything long-term running out of my own place (I once covered a whole wall with post-its carrying notes, plans, and numbers for my thesis of sorts; my room was in a truly crazy state back then).

However, despite the clutter, the workspaces of people who actually work on things tend to have some weird method to their madness. For example, it's rare to see actual 'filth' among the clutter. Sure, there are notes, pieces of paper, books, and gadgets everywhere. If the person is in a laboratory-oriented profession, perhaps even some reagents. But never filth. No half-eaten food rotting away, no weird yellow/brown stuff of mysterious origin. All the clutter is information, all of it vital to whatever he or she is doing. Food isn't information, and it's not vital to finding some new law that governs high-energy plasma. Or writing science fiction. Or designing proteins to save human lives. So yeah, if you walk into someone's workroom and smell rotten food all over the place, chances are he or she isn't working, just being lazy and wasting time. But if you walk into a workroom and find a crazy amount of papers and scribbled-on scraps everywhere, don't touch anything. Those people get stuff done.

Here's a July system guide from Ars Technica aimed at building gaming machines. Even the 'value' gaming machine on there (~$900) is effectively futureproof. You'll still be running contemporary games four years from now with that kind of machinery. You can add some more oomph by carefully working a 4- or 8-core processor into the machine, with 16GB of RAM or more. But then that would be overkill. Not only would such a machine be futureproof, it would be on equal footing with some of the heavier single-box semi-supercomputers in some labs, the kind used for in-house protein calculations. Of course, a machine like that will guzzle electricity, so anyone who can run that kind of machine for four years is probably very rich or doesn't pay his or her own utility bills.

Despite the fact that most of my computing these days revolves around mobile solutions, systems like that are very tempting to build. Just imagine the things I would be able to do with a graphics card with 1GB of dedicated DDR3 memory and all sorts of crazy shader capabilities. Not just games, mind you. With upcoming frameworks like CUDA it would be possible to offload compute-intensive processes to the GPU instead of running them straight off the CPU, in effect turning such machines into mini supercomputers, at least compared to the puny units of the current generation. Even laptops might be able to run some serious number crunching once the system's perfected.
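Just to make the idea concrete, here's a rough, hedged sketch of what the offload looks like, assuming PyCUDA and a CUDA-capable card (which my flaky ATI chip most certainly is not): ship an array to the card, do the arithmetic there, copy the result back.

    # A hedged sketch of the GPU offload idea, assuming PyCUDA and a
    # CUDA-capable card are available.
    import numpy as np
    import pycuda.autoinit            # sets up a CUDA context on import
    import pycuda.gpuarray as gpuarray

    data = np.random.randn(4000000).astype(np.float32)

    cpu_result = data * data                  # number crunching on the CPU

    data_gpu = gpuarray.to_gpu(data)          # ship the array to the GPU
    gpu_result = (data_gpu * data_gpu).get()  # crunch there, copy it back

    print(np.allclose(cpu_result, gpu_result))  # same numbers, different silicon

The interesting part is that the GPU half of that spreads across the hundreds of little processors on the card, which is exactly the sort of thing those in-house protein calculations eat up.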

Monday, July 27, 2009

Late night. What to do?

Every so often I'm faced with a conundrum.

It's late at night, and either I have something I need to finish before the sun rises, or I'm in the midst of some strange problem that just won't let me sleep, mentally or physically. I would normally get some work done in situations like that, but for some reason I can't. There's something in my mind that just stops me from functioning normally, as if a pebble got caught between the cogwheels of my mind. I can feel the urge to do something building inside me, but I can't channel it into anything useful, the energy just disappearing like everything else that follows the slow, painful course of thermodynamic dissipation in this universe. (That makes me think: it would be so interesting to come up with a model that describes human creativity as a function of the thermodynamic machinery of the universe.)

When I'm faced with such difficult situations I usually try to do something that doesn't require much coherence yet still needs some kind of input from me. And over the years I've found writing (and sometimes drawing) to be the perfect solution for those late-night blues... I also play a bit of violin (just picked up a new one a few weeks ago, in fact), but that's a difficult hobby to have in a city where the walls between apartments are usually thin enough to be punched through (though it isn't nearly as bad as the situation in Japan).

I've picked up a few useless skills over the past few months as well. Did I ever write here about how I never learned to touch-type, and how my friends were always giving me the weird eye over it (living around geeks and geekettes has that side effect)? Well, I learned to touch-type a few weeks ago, roughly around the same time I got my new violin. It only took me about a day or two to memorize the layout of the keys, which is only to be expected, I guess, considering I've lived with a computer for half my life. The rate at which I got used to writing on the keyboard without the hunt-and-peck approach surprised me a bit, however. Right now I'm writing this without looking at the screen. That's right. I'm writing this while looking out the window of my room, without looking at the screen or the keyboard. Who would have thought it? Learning to type completely blind in the course of a week or two.
I still need to get used to the keyboard, though. I still make some odd typos and my WPM isn't all that high. Average at best. It's something I really need to work on, considering the volume of writing I do on an everyday basis, both for pleasure and for work.

When I'm writing things like this, all alone in my room sitting on my couch, I always play some kind of music. In fact, I can barely remember the last time I went without playing some kind of music around me. The iPod is plugged into my ears practically every single moment I'm outside, and whenever I'm home I play music on my laptop's speakers, or, when it's late at night, through the wireless headphones that plug into the computer's speaker port (I only use Bluetooth for syncing my cellphone with my computer, for some reason). Of the terabytes of data I'm sitting on, the vast majority of the space is taken up by music from all over the world, across all sorts of genres. I have Bach, Mozart, and Beethoven, each representing their own era. I have some rock, some of it the harder variety. I also have a crazy collection of J-pop compilations and singles, many of them as original CDs sitting in a storage space in the city, since it was way too impractical to bring them along on my frequent moves. I regularly buy music from promising bands and composers, like the OST/inspired album for Neotokyo. It makes me look like some sort of freak in this day and age, when people my age don't quite seem to buy anything that's available in digital format.

Music must be one of the most fundamental inventions of humanity. Perhaps the invention of music is the event we can clearly mark as the moment of divide between human the Homo sapiens and human the semi-ape. It's logical, yet impulsive. It's formless, yet the system that makes music possible can be observed all across the world, and across the universe in the weirdest places: the shapes of galaxies, the pulses of stars, the patterns of moss in a forest. Music is very mathematical in that regard, and it is probably no surprise that expertise in one usually accompanies the other... There are some people who say the arts are too different from the sciences for the two to coexist, but I tend to think that's just a way to cover for their own incompetence. All the greatest artists in human history were scientists in one form or another, and this pathetic division that forces a child to choose between the path of the arts and the path of the sciences is a freakish accident of social history that has nothing to do with the arts or the sciences themselves. I say this a lot these days, but really. One day, future generations will look back at the state of the arts and sciences today and laugh, or be horrified, at how crazy and irrational it all is...

Well I think I'm through venting for now. Gotta get back to work for the day ahead. 



Sunday, July 26, 2009

It's almost annoying-- And pretty pictures

I keep writing double-topic posts on this blog for some reason. I think it has something to do with how ridiculously difficult it is to concentrate on anything these days, what with the weather, the financial situation, and the very weird family matters I shouldn't even be worrying about at this age.
-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

How hard it is to blog properly these days. I mean, sure, it's easier than ever to type things up on either my notebook or the BlackBerry and publish it all straight to the net, but it's just way too difficult to write a blog post with properly-thought-out reasoning and half-decent grammar. The problem is coherence. It's getting more difficult to write things that are coherent. Without coherence in the reasoning behind the writing, I might as well let a Python script do the talking by stringing together random words from a dictionary (now that I think about it, that might be fun; I should try it on another blog).
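Something along these lines, say (a throwaway sketch; /usr/share/dict/words is the usual Unix word list and may not exist on every machine):

    # Throwaway sketch of letting a script 'do the talking': string together
    # random dictionary words into fake sentences.
    import random

    with open("/usr/share/dict/words") as f:
        words = [w.strip() for w in f if w.strip().isalpha()]

    def ramble(n_sentences=3, words_per_sentence=8):
        sentences = []
        for _ in range(n_sentences):
            picks = random.sample(words, words_per_sentence)
            sentences.append(" ".join(picks).capitalize() + ".")
        return " ".join(sentences)

    print(ramble())

Coherence not included, obviously.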

Due to the difficulty of writing lengthy yet coherent pieces, I've missed a lot of opportunities for some good posts. New developments in technology like the growing of a whole rat from an iPS cell culture derived from another adult rat (with some issues, but that's only to be expected), or protein-induced pluripotency in cells (the actual paper I have yet to read), or even the hypothesis that the emergence of life might be hardwired into the complex system that is the universe (which is something I've suspected for a long time, but this is probably the first time it's been capitalized on in a popular science publication). Don't even get me started on the plethora of amazing TED talks out there that I'm just dying to share with you all.

This is one of the most annoying things about maintaining a personal blog. Am I a content creator, or am I just copy-pasting cool news of other people's accomplishments into a digital medium for further copy-pasting, like most Tumblr accounts do (with some notable exceptions)? I always try to write my own stuff, but the product of such creative exercise rarely if ever looks as exciting as the discovery of quorum sensing, a new take on the complexity sciences, or new developments in synthetic biology...

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

On another note, here are some interesting pictures of other people's computer workspaces. I think I should post my multiscreen setup here sometime in the future as well.

There are more at this webpage. Looking at other people's workstation setups is always fascinating to me. I guess it's a kind of technofetishism/infornography that's so common these days. I know a number of people who maintain elaborate workstation environments and give them lavish names like 'the cathedral' and such (I'd much prefer the term temple, or a library, but to each her own I guess). And while I don't operate anything as fancy as that except in my lab, which doesn't count since the hardware in that place doesn't really belong to me (8 cores with 16GB RAM, 3 screens, wowowiwa!), I understand what they are going for. With society being built increasingly around the engines of information we call 'com-pu-ters', they've become an essential feature of any semi-viable household. All of my friends think that while it's possible to live without a TV, it would be impossible to live without access to the internet and some sort of computing device. Even the non-techie ones who can't tell the difference between Java and C++. It's a shielded environment where one can fulfill both the functions necessary for life (earning a living) and the functions necessary to keep the mind alive: through education, fun, contact with other people, and just plain ol' time-wasting. Just as swordsmen take meticulous care of their swords, providing lavish casings and decorations of the finest materials for the tools of their trade and mental compass, so many of us do the same for our computers.
I would love to be able to set up a beautiful workstation like that in my own house, but it is a little difficult at the moment. I move around frequently, both in terms of going around the city for jobs and moving to another place to live for whatever reason. So my main computer has been a laptop for a long time. And since I can't seem to completely give up the computer-gaming side of myself (well, console gaming as well, with a PS2 and a DS Lite, though I haven't really played them for... months), all of my laptops have been light-yet-workstation-class machines with dedicated graphics.

Even with mobile computing, however, I still maintain something very close to what those people do with their physical workstations. In the real world I like to keep my desk area meticulously clean. Just some spare USB cables for my netbook/ereader/BlackBerry, my external HD solution totaling close to 2TB of storage space, a lab notebook (paper), and my laptop. That's it. The rest is white and wood. No carpet, no dust. It's really wonderful. You'd be surprised how much a clean workspace contributes to productivity.
As for 'pimping out', with me it's usually all in the computer. Instead of buying exotic figurines or lighting fixtures for the workspace like some other people, I stick with the software side of things. I run a custom theme that looks cool and clean and eats less memory than the default Vista theme. I have personal organizer software running as a separate sidebar application instead of the Windows-supplied sidebar, which, while nice in functionality, uses too much memory and is a possible security risk. I am also very careful about choosing my desktop background image. Being pretty isn't good enough to be chosen as my desktop background. It needs to have a certain aesthetic quality that works with the rest of the software platform. I'm currently running a 3D simulation of human brain neurons as my desktop background, and it fits in with the rest of the computer and my work applications perfectly. It's like the whole thing was made with everything else in mind.

And the desktop's just the beginning. I also pay a significant amount of attention to my web browser, which is probably somewhere between the first and fourth most used application on any computer I use. Choosing a web browser is a really complex, sometimes draining process. Not only do I have to be aware of the aesthetic look inherent to a browser, I also need to consider its technical capability and memory consumption. Since mobile computing is a big part of my life, I really need to watch the memory and processor consumption of all my applications. I can't have my machine run out of juice just when I'm about to deliver the paper I've been struggling with for three months, you see. Web browsers serve all sorts of purposes for me. A banking terminal. A programming tool. An entertainment machine and a terminal to a different world.

At the moment I run three web browsers on my computer. Opera 10b with the Opera Unite service activated (more on that later), Firefox 3.5 with Greasemonkey and all the necessities, and Google Chrome. I just can't seem to figure out which browser I like best, but the default browser on my computer remains Firefox, for its wider compatibility. Opera 10b is something of a mixed bag. I could write up a few things that really need to be improved in the browser, but overall the build is very tight, with all sorts of functionality and widget availability that make this browser feel like a separate operating system independent of the Windows Vista it runs on. I'm also in love with the Opera Unite service, which turns any instance of the Opera browser into a personal web server with configurable programs/services you can download directly off the net. I can see where this service is going and I like it. Google Chrome is something of an oddball. I liked the browser so much that I briefly used it as my default. It's the fastest of the bunch, and you can certainly feel the speed difference compared to all the other web browsers. It's secure with the whole sandbox mechanism, perhaps even more so than other web browsers on the market. Google is working on all sorts of crazy projects to increase the functionality of the browser, and it already has a significant number of improvements built into it. Yet the interface remains minimalistic, with most of the 'gears' hidden beneath a clean shell some people think is 'too clean'. I like it. It does everything I would ever want from a web browser, and it's open source with the full might of Google standing behind it, meaning it's going places with new and innovative technologies. The problem is with the memory and processing requirements of the browser. It slows my brand-new laptop to a crawl when left on for a whole day or two, which is something I usually do with my computers. Opera and Firefox so far don't seem to suffer from that problem.

Writing about all these things makes me feel like a geek, or an otaku of sorts. The vast majority of people out there definitely don't bother with theming their operating system, figuring out the perfect color sheen for the desktop wallpaper, or worrying about Acid3 test results in their browsers. I guess I am a semi-otaku of sorts. Otaku meaning a person obsessed with information, be it about the newest anime or computer technology, biotechnology or robotics. Infornography seems to be the right description of how otaku treat information... More on that later.

Saturday, July 25, 2009

Wake-up call. Change the world.

Just a rough draft of something I've been thinking about a lot lately... It's good to be able to do some draft publishing before releasing things as a full version. Normally I would do this kind of thing on my handset, but why type away on that minuscule keyboard when I can write in the comfort of my own laptop, courtesy of the free wifi access points throughout the city? (Which is truly marvelous. Not that many major cities in the world offer muni-supported wifi access points. Take Japan, for example: those people are obsessed with getting paid for letting you browse on their wifi spots.)

As usual, I'm busy with all sorts of studying and jobs to keep myself alive. If I've realized one thing about myself over the course of the years, it's that I count curiosity and the pursuit of the ever-greater 'stuff' of the world as an integral part of human existence. I probably can't live without being able to learn more things and step closer and closer to the edge of the world. Sure, food and shelter are about the only things a human being needs to survive directly, but if that were all, the ideal lifestyle would be being locked up in a municipal insane asylum, wouldn't it? Freedom of mind and body is just as integral to a living existence as immediate nourishment and protection from the elements of the world. It sounds obvious when laid out like this, but there are surprisingly many people who think otherwise. People with power who affect the lives of other people. Goes to show how sane this world is, doesn't it?
 
I've been looking into more of the synthetic biology stuff, making use of the relatively ample free time made available to me during the summer. I think I'm beginning to come up with a tangible idea and timeline for the impending DIYbio artificial/synthetic cell project. I still don't know if the other members of the NYC group will approve; all I can do is work on the stuff until it's just as realistic as getting off the couch and going out for ice cream. As long as I keep the target relatively simple, like having functional DNA snippets within an artificial vesicle, it might work with standard BioBrick parts... Just maybe.
 
I've been watching some old hacker movies lately. Or should I say that a friend of mine has been running a screening of sorts for the past few weeks? And I just can't believe what kind of cool things those movie hackers were able to pull off with their now-decades-old computers and laptops. Computers with interfaces and hardware that exude that old retro feel even across the projection screen. I know a lot of people with brand-spanking-new computers and state-of-the-art hardware, and what they usually do, or can do, with those machines isn't as cool as the stuff in the movies being pulled off with vastly inferior hardware and network access. Of course, like everything in life, it would be insane to compare the real with the imagined, and Hollywood movies, especially the ones made in the days when computers were still new and amazing pieces of specialty gadgetry, have a bad tendency to exaggerate and blow things out of proportion (I'm just waiting for the next dumb movie with synthetic biology as the culprit, though it might not happen, since Hollywood's been barking about the decency of genetic engineering technology for over a decade now). Even with that in mind, I can't help but feel that modern computerized society is just way too different from the one imagined by artists and technologists alike in those days.
 
Ever heard a younger Steve Jobs talking in one of his interviews? He might have been a bastard, but he certainly believed that ubiquitous personal computing would change the world for the better. Not one of those gradual, natural changes either. He actually believed it was going to accelerate the advancement of humanity in the universe, very much like how Kurzweil preaches about the end of modernity with the upcoming technological singularity. Well, personal computing is nothing new these days. It was actually quite stale until a few months ago, when people finally figured out that glut-ridden software with no apparent upgrade in functionality was a bad thing, both for the environment and for the user experience. Ever since then they've been coming out with some interesting experiments, like the Atom chipset for netbooks (as well as netbooks themselves), and the Nvidia Ion system for all sorts of stuff I can't even begin to describe. And even with the deluge of personal computing and personal-computing-oriented changes in the world, we have yet to see the kind of dramatic, real, intense change we were promised so long ago. Yeah, sure, the world's slowly getting better. It's all there when you take some time off and run the real numbers. It's getting a little bit better as time goes on, and things are definitely changing, like some slow-moving river. But this isn't the future we were promised so long ago.
 
We have engines of information running in every household and in many people's cellphones right now. What is an 'engine of information'? It refers to all sorts of machinery that can be used to create and process information content. Not just client-side consumption devices where the user forks money over to some company to get little pieces of pixels or whatever, but real engines of information, capable of creating as well as consuming. It's as if this were the Victorian era and everyone had steam engines built into everything they could think of. Yet still, nada. Nothing. Zip. The world's rolling along at the same pace as before, and most people still think in the same narrow-minded little niches of their own. What's going on here? Never in history has such a huge number of 'engines' for the expansion of humanity been available to so many people at once. And that's not all. Truly ubiquitous computing made available by advances in information technology is almost here, and it is very likely that it will soon spread to the poorer parts of the world in similar fashion to the large cities of the G8 nations.
 
But yet again, no change. No dice. Again, what's happening here, and what's wrong with this picture? Why aren't we changing the world with computers at a vastly accelerated rate, the way we changed the world with rapid industrialization? That's right. Even compared to the industrialization of old, with its relatively limited availability and utility of steam engines, we are falling behind on the pace of changing the world. No matter what angle you take, there is something wrong in our world. Something isn't quite working right.
 
So I began to think during the hacker movie screening, and by the time the movie finished I was faced with one possible answer to the question of how we'll change the world using engines of information. How to take back the future from spambots, 'social media gurus', and unlimited porn.

The answer is science. The only way to use the engines of information to change the world in a tangible way is science. We need to find a way to bring the sciences to the masses. We need to make them do it, participate in it, and maybe even learn it, as outlandish as the notion might sound to some people out there. We need to remodel the whole thing from the ground up, change what people automatically think of when they hear the term science. And tools. We need tools for the engine of information. We need software-based tools so that people can do science wherever there is a computer, and do it better wherever there is a computer and an internet connection. And we need to make it so that all of those applications and services can run on a netbook-spec'd computer. That's right. Unless you're doing serious 3D modeling or serious number-crunching, you should be able to do scientific stuff on a netbook. Operating systems and applications that need 2GB of RAM to display a cool visual effect while scrolling a text-based document are the blight of the world. One day we will look back at those practices and gasp in horror at how far they held the world back from the future.
 
As for actual scientific applications, that's where I have problems. I know there is already a plethora of services and applications out there catering to openness and science integrated with the web. OpenWetWare and other synthetic biology-associated computer applications and services (www.openwetware.org/Notebook/BioBrick_Studio.html) come to mind. Synthetic biology is a discipline fundamentally tied to the use of computers, access to outside repositories and communities, and a large amateur community for beta testing its biological programming languages, so it makes sense that it's one of the foremost fields of science open to the public, offering a number of very compelling design packages for working with real biological systems. But we can do more. We can set up international computing support for amateur rocketry and satellite management, using low-cost platforms like the CubeSat. I saw the launch of a private rocket into Earth orbit through a webcam embedded in the rocket itself. Sitting in my bedroom with my laptop, I actually saw space from the rocket's point of view as it left the coils of the Earth and floated off with its payload. And this is nothing new. All of this is perfectly trivial, of such technical ease that it can be done by a private company instead of national governments. And all the peripheral management for such operations could be done on a netbook, given a sufficient feat of software engineering. There are other scientific applications I could rattle on about without pause... So why isn't this happening? Why aren't we doing this? Why are we forcing people to live in an imaginary jail cell where the next big thing consists of scantily clad men/women showing off their multi-million-dollar homes with no aesthetic value or ingenuity whatsoever? Am I the only one who thinks the outlook of the world increasingly resembles some massive crime against humanity? It's a crime to lock up a child in a basement and force him or her to watch crap on TV, but when we do that to all of humanity suddenly it's A-OK?
 
We have possibilities and opportunities just lying around for the next ambitious hacker-otaku to come along. But they will simply remain possibilities unless people get to work on them. We need software and people who write software. We need academics willing to delve into the mysterious labyrinths of the sciences and regurgitate them in a user-friendly format for the masses to consume, with enough nutrients in it that interested people can actually do something with it.
 
This should be a wake-up call to tinkerers and hackers everywhere. Stop fighting over which programming language is better than which. Stop trying to break into the Facebook account of whichever snotty-nosed brat. Get off your fat sarcastic asses and smell the coffee.
 
Get to work.
Change the world.


Tuesday, July 14, 2009

VirtualBox and other things

These days I see less and less utility in full-sized laptops as media tools, since it is technically possible to do most basic multimedia and creative work on a well-equipped mobile. I've been watching quite a lot of YouTube videos on my BlackBerry, which is surprisingly usable despite its limited memory and processing power. This is with a mobile that's already outdated and in the process of being phased out, so the experience can only be better on the higher-end handsets coming out these days... It's a little weird to have access to all the online research papers, video journals, and dubbed anime (subtitles would turn me blind on this screen), not to mention IRC and instant messaging clients, in the palm of my hand. The ubiquity of information is addictive when you're sufficiently exposed to it, and that is a lesson the mobile/connectivity corporations would do well to remember. (That makes me think: the mobile market today might have turned out a lot differently if some innovative company could be in charge of both communications infrastructure and handset design, or even handset design without having to worry about the politics of communications infrastructure.)

I played around a bit with VirtualBox this afternoon. Usually I run VNC within Windows to access alternative platforms, but I thought it would be interesting to run a minimalistic Linux/UNIX OS inside the Windows installation. I used #! Linux, and the whole process took about five minutes. It's remarkable how simple it is compared to only a year or so ago. VirtualBox still has a bit of an issue resolving a graphics driver that can display resolutions higher than 800x600, so I had to do some manual tweaking that involved hosing one system installation, reverting to a previous session, reinstalling the guest additions, and editing xorg.conf, but compared to some other installations it was all a minor hurdle. ...
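For the record, the resolution part of the tweak boiled down to something like this in the guest's xorg.conf (a rough illustration only; the identifiers and mode list will differ from setup to setup):

    # Screen section of the guest's xorg.conf, with explicit modes added so X
    # will offer more than 800x600. The identifier shown here is illustrative.
    Section "Screen"
        Identifier "Default Screen"
        SubSection "Display"
            Depth 24
            Modes "1280x800" "1024x768" "800x600"
        EndSubSection
    EndSection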
More stuff on the state of minimal cell research.


Monday, July 13, 2009

It's funny, and somewhat unfortunate, that I keep choosing two different topics to cover in a single day's post. Perhaps this type of indecision is the reason behind the stagnant state of my WordPress blog, which tends to demand a bit more quality and coherent thought than my LiveJournal?

I've been watching some Hunter S. Thompson biographical material on the net for the last few days, including a biography TV show on Hulu. An interesting person, who gave birth to the style of writing we now refer to as gonzo... basically a subjective, free-form exercise in journalism unrestrained by traditional format. Some might cringe at subjective journalism, but then what journalism is truly objective? When you get right down to it, the difference is in the language used, and gonzo journalism never makes any pretense of objectivity. The technique of letting the reader get a first-person account of the experience in question was revolutionary in its time, and it permeates all sorts of different media today, starting with the faux reporting seen in Warren Ellis' Transmetropolitan series, where the whole of the comic is more or less written through the eyes of Spider Jerusalem, who was probably modeled on Hunter S. Thompson.

Compared to all the psychos and sickos out there, Thompson certainly maintained a certain method to his madness until the very end. It would have been really interesting to see what such a character could do given the technological tools of the future/transhumanism. Maybe he would have ended up blowing his head off all the same, due to the psychological burden?

I've had a rekindled interest in Lovecraftian writing recently, mostly due to my little toy project of making a Python-based program that churns out random, endless stories drawn from expressions in its database. I call it the Monkeyshaker 1000, after an acquaintance's suggestion that 1000 monkeys typing randomly at typewriters might really end up producing a Shakespeare. I've been thinking of all sorts of things for the program to draw upon and create, and the answer is one of these two.

1) Scientific literature that draws on official (meaning verified, unlike the heap of steaming #$%! we call Wikipedia) databases on the net to produce comprehensive reports on rather meaningless, machine-dictated topics.
2) A creator of cheap knockoff novellas, the kind of stories people commonly refer to as dime store novels. Such generic entertainment novels (paperbashing?) usually follow such vapid structure and vocabulary that I don't see much difficulty in making a program churn out (albeit rather peculiar) pieces of short writing.

I think I'm going with number two. It is decidedly much easier than the first approach, and I already have a cool database to draw upon: the license-free works of Lovecraft. I just wonder what kind of peculiar roman the computer program will be able to come up with using a database full of antediluvian references. Maybe I can bill the resulting piece as gonzo journalism from a Lovecraftian universe, written by some haywire android.
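The core will probably be nothing fancier than a word-level Markov chain over the corpus. A rough sketch of what I have in mind (the corpus filename is a placeholder, and the real thing will need smarter sentence handling):

    # Rough sketch of the Monkeyshaker 1000 core: a word-level Markov chain
    # over a plain-text Lovecraft corpus. 'lovecraft.txt' is a placeholder.
    import random
    from collections import defaultdict

    def build_chain(text, order=2):
        """Map each run of `order` consecutive words to the words that follow it."""
        words = text.split()
        chain = defaultdict(list)
        for i in range(len(words) - order):
            chain[tuple(words[i:i + order])].append(words[i + order])
        return chain

    def babble(chain, length=60):
        """Walk the chain from a random starting key and emit about `length` words."""
        key = random.choice(list(chain.keys()))
        out = list(key)
        for _ in range(length):
            followers = chain.get(tuple(out[-len(key):]))
            if not followers:                      # dead end: jump somewhere new
                key = random.choice(list(chain.keys()))
                out.extend(key)
                continue
            out.append(random.choice(followers))
        return " ".join(out)

    with open("lovecraft.txt") as f:
        print(babble(build_chain(f.read())))

Order two keeps things antediluvian-sounding without copying whole sentences verbatim; crank the order up and it starts plagiarizing the old gentleman outright.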

On a lighter note, there's a science competition going on over at Space Generation, aimed at anyone under the age of 33. You're supposed to come up with a novel method for stopping a possible asteroid strike on the Earth, using currently available technology or the kind of technology that can reasonably be developed in the future. Novel meaning ingenious. Not another crappy knockoff of the 'build a superweapon' or 'nuke it' or 'shoot a linear cannon nuke' crap that a billion and one other people have already proposed, with detailed drawings and technical requirements. Something really new and scientifically feasible. The contest is obviously aimed at students just starting out in space engineering and the astronomical sciences, so they might be willing to overlook some of the more incredible ideas, but they are still looking for something worth presenting at the science congress they are holding in Korea later this year.

Maybe I should stop by Korea in autumn, see how the whole event goes. Sounds interesting. 


Sunday, July 12, 2009

Wireless and junk DNAs

It's very weird how putting the things of my day-to-day life together in a title makes it sound like some whacked-out story reminiscent of Haruki and his line of chic-absurdist fiction.

I'm glad to say that after some hassle I got the wireless at my new apartment working at last. The rental contract actually comes with free broadband internet access, which I guess isn't too unusual in this day and age. But the thing is, the cable modem in use at my apartment is a relic from when dial-up was still king and people flocked to America Online. The initial attempt at connecting the cable modem to a wireless router ended with the cable modem sending out a corrupted packet so arcane that it instantly screwed up the wireless card on my new-ish ThinkPad (the roommate's MacBook and my Linux laptop were fine. Ugh, Vista, why do you suck so much?), to the point that I had to spend the next week figuring out how to get it to work. In the end I fixed it by deleting the device from the system panel and reinstalling it, which was something of a gamble, since according to Google that only fixes anything about half the time, with no one knowing the actual reason behind the malfunction.
After my laptop got back into networking-ready shape, I stuck with a wired ethernet connection at home for a while, since I didn't want to risk frying my wifi card again on the poisonous packets sent out by the antediluvian cable modem. I strongly suspected some sort of Lovecraftian mystery, filled with murder and hideous secrets, behind the nature of that unassuming block of grey plastic, and was content for a while living like someone from the mid '90s.

Alas, life chained to an ethernet cable in my own home grew too vapid for me. For someone who needs good access to a computer almost 24/7 to pay the bills, couch/bed/front-porch computing is hugely important. If I'm going to be stuck in front of the screen, at least let me choose the location (as a side note, I frequently work in the park when I'm out; all that sunlight and fresh air while I'm working does wonders for productivity). So I decided to do some real research on how to get the modem to play nicely with my wireless router.

Well, it turns out the problem was the age of the router itself. It was made so long ago, back when wireless access to the net was a precious opportunity for the rich and the cutting-edge, that it isn't properly shielded from the electromagnetic fields of other appliances within one to three feet of it. From there the solution was simple: use the ridiculously long ethernet cable I'd been using for my laptop and place the modem and the router at opposite ends of the room. Funnily enough it worked, and I'm writing this from my couch. I don't know whether to be happy or infuriated about the hurdles I had to go through to get something as simple as an encrypted wireless network running.

On another note, I've been following the Dresden Codak webcomic since its inception. I even made an ID on the forums, though I've only posted there a few times at best. Here's the newest comic at the site. There's something about the combination of the fantastic and the scientific in that webcomic that I find very charming. Yet unlike some other webcomics, Dresden Codak still retains a sharp outlook on reality that makes me wonder if the author is really drawing the future.

I really like the main character. There's something about her that's very appealing to me on some basic level I can't quite explain. Maybe it's because she's an eccentric mad scientist. And as I have stated numerous times before, everyone at heart secretly longs to become a mad scientist. I want to be able to tinker with some portions of my DNA on the fly as well, provided I have a much better understanding of its mechanisms and quirks than is available to academia at the moment. I also want to ponder the questions of the universe and work at solving them, or at least understanding them, instead of playing second fiddle to the real researchers on the cutting edge of humanity's learning (and someday I might be able to achieve that, if I play my cards right).

I'm just trying to join the ranks of the DIY mad scientists with my slow-but-steady research on the viability of a minimal cell system built using cheap, affordable tools. I really do think it's possible. The more I learn about it, the better the chances look, provided that I don't do anything too elaborate. I'm a little skeptical about the research-tool potential of a DIYbio minimal cell, but then we'd actually need to have something on hand to decide that kind of thing, wouldn't we?

Saturday, July 11, 2009

Crunchbang linux and Giant robots

I'm always on the lookout for the ideal operating system, the one that would make my life much easier and actually let me get more work done instead of wasting my time. I guess this is my way of chasing the shiny new thing, just like how some people shop around for accessories and clothes, except that in my case it's all a bit more practical, since I actually work in most of the OSes out there (but then those people wear their clothes to work too...).

The world's operating systems are roughly divided in two at the moment: Linux/UNIX derivatives and Microsoft Windows. In terms of general engineering and architecture, MS Windows isn't really top of the line. I'd say Windows as a whole is more like an inexpensive housing project and the UNIX derivatives are Gothic cathedrals (it's actually quite an accurate description of both codebases), except that the housing project costs an arm and a leg and the cathedrals are more or less given away for free, provided you have the technical proficiency to configure and maintain the juggernaut. Of course, the analogy is a little off these days. MS has really made strides with its operating system (partly because everyone loves to hate MS), and the current Vista/Win 7 lineup is leagues better than what XP and 98 were, at least in terms of the general architecture of the OS. And some of the Linux/UNIX-based OSes are really cleaning up their acts and becoming impressively user-friendly, though there are still aberrations, like some of the more hardcore BSDs that seem to treat usability as an OS hazard... Am I alone in thinking that kind of behavior is reminiscent of the old, old days when people complained about schools because letting too many people learn would devalue education? (Yep, they really said that.)

Well, of the OSes out there, the Ubuntu distribution more or less tops the chart in terms of user-friendliness, along with OS X, which is really just a shiny shell on top of a BSD. Ubuntu is in turn based on Debian, which isn't as user-friendly but has remarkable stability and a wonderful application repository, meaning that whatever your needs are, you'll probably be able to find what you need and download it for free using Synaptic or apt-get. The wonderful thing about such a system is that you don't have to google for the applications you need. You can just use Synaptic to search and download on the spot, with the application integrated cleanly into the OS and rarely any need for post-installation configuration.

The problem with Ubuntu, however, is its open-source-spirit-inspired stance against anything proprietary, which is a big thing in this day and age. Take Flash, for example. It's everywhere, and there are countless uses for it, ranging from online lectures to lab reports (ah, the sad life of a sci student: the first things that come to mind when thinking of Flash aren't movies or YouTube). On a conventional Ubuntu distribution, anything proprietary usually needs an extra step or two from the user to install, unlike most modern OSes of the commercial flavor, which tend to come with such necessities. It might not sound like a big deal, but this is a deal breaker for a lot of non-hacker PC users out there. I mean, if you're going to advocate open source to people, at least provide them with a decent alternative. Don't just expect people not to use something in order to safeguard some philosophical ideology. I'm sympathetic to open source, but some of the approaches taken by its hardline proponents have a taste of cloistered conceit to them, which I think will hurt the movement in the long run. There is a second problem in that Ubuntu is big. Of course, Ubuntu is probably one-fifth the size of Vista even at its biggest, but compared to other Linux/UNIX distributions out there it just feels a little sluggish for some reason.

Well, I think I just found the ideal free operating system, one that has no qualms about providing its users with a first-rate experience regardless of whether the pieces are opensource or not. It provides basically all the essentials I can think of in an OS, both proprietary and opensource, and it's much lighter than a comparable Ubuntu install. In fact, my fully tricked out installation of that OS clocks in at only 1.2 GB of hard drive space (that's the OS plus all the third-party apps I installed off the net and the repository) with 200MB of RAM usage when browsing with Firefox with 5+ tabs open. The installation was a snap too. It recognized all my hardware, including the wifi and SD-card reader, and all the laptop's built-in shortcut keys work out of the box.

They call this #! linux, which is read as CrunchBang Linux. It's a modified version of the Ubuntu OS I mentioned earlier, except much lighter and more responsive. Unlike most other lightweight distros out there, #! has really great hardware support and even better application compatibility. With #! you have the whole Ubuntu repository at your fingertips; it would be practically impossible not to find an application you need.

If I ever go all out with Linux on my primary machine, I think I'll choose #! without a moment's hesitation.

Oh, by the way, in case you haven't been keeping up with the news, Tokyo has built a full-scale replica of the original Gundam in the Odaiba area.
 



It's full scale, meaning it's ridiculously huge. For some reason the original UC Gundams are rather big, with the following series progressively introducing smaller, sleeker models.




The amount of detail on this replica is amazing. They actually worked out all the decals and directions for the fictional Earth Federation engineers for repairs/upgrades/tuneups, etc. It's obvious they put a lot of thought into the whole thing. They should, really; I think this is for the 30th anniversary celebration of the birth of Gundam.



Again, amazing detail. And the building in the background somehow fits in well with the mecha.




Here's a picture of it at twilight. Just too cool for words. As a side note, the apartments in the background aren't really that expensive. In many cases such apartment towns are middle-class housing and a common sight in many parts of Northeast Asia. Some of the newer apartment towns have little streams and parks between the buildings. It's surprising how some people assume the dilapidated and overpriced housing conditions in and around NYC are the same in other parts of the world.




And Gundam at night, with the eyes lit up. From what I hear it doesn't move, but the very fact that they were able to build something like that in a major city is big news in itself. Why won't NYC do something fun like that once in a while?


By the way, all the pictures are from Flickr and the wonderful people who decided to upload them.

Thursday, July 9, 2009

Life extension

There are a lot of life extension enthusiasts out there these days. The field has been getting progressively more media attention with advances in biotechnology and the pharmacological sciences... The relatively recent mainstreaming of transhumanism really helped too. It's gotten to the point that there are full-scale research institutions out there, supported by prestigious universities and grants, devoted to researching a solution to the intriguing problem of death. Maybe it's driven by a living system's innate desire to live on. Maybe it's just a show of curiosity, an emotion that was never too rational to begin with.

Regardless of the validity of the concept of transhumanism in the modern world, the prospect of halting senescence is attractive to many. The only thing certain in life is death, and people are looking toward the sciences to ward off that certainty, maybe forever. And there have been a lot of interesting developments in the area as well. Most of the research is being driven by new technologies for gene and protein discovery and extraction, and the latest discovery, which came from a soil microbe found on Easter Island, might serve to demonstrate how some of the most intriguing research in the biological sciences can come from simply studying the mechanisms inherent in one of the many variants of living systems already on this planet, products of years of evolution.

The linked article is mostly about the effect of the chemical compound rapamycin on older mammals and its rather intriguing effect of prolonging their lifespan by 9~13% even when administered late in life... I am not entirely sure of the mechanism that allows the compound, injected or ingested, to prolong lifespan, since aging and its effects tend to be the result of different innate and external factors surrounding the organism. If it is possible to use a chemical compound to somewhat prolong the natural lifespan of an animal even late in its life, could there be a definite central mechanism running the whole-body aging process behind the scenes? Would it mean that it's possible to prolong human lifespans using such compound therapy, without resorting to a lifetime of controlled diet and full-scale genetic modification?

Rapamycin is apparently a compound that is already approved by the FDA and in use by the medical community for immunosuppressant therapy. Would that imply that the aging system built into mammals is somehow linked with the basic immune system as well? Intake of rapamycin would lower the immune response of the subject, so there definitely is a chance that the subject will in fact die from infections despite an increased base lifespan. It's something of a catch-22, and an interesting reminder of the folly of the common conception of senescence as some kind of flaw or even a disease. If anyone is serious about artificially halting the processes of senescence, we must consider the possibility that death might be a natural result of the kind of physical system we have for our bodies, just as our bodies require us to eat and drink in order to sustain them for any significant length of time.

Wednesday, July 8, 2009

Google Chrome OS update

 
(Here's another article with a simple overview of what the Google Chrome OS will be about, so far: http://www.fastcompany.com/blog/kit-eaton/technomix/google-drops-bomb-its-own-operating-system)
 
There's no real new news on the state of the Google Chrome OS (from here on referred to as the Chrome OS). But glancing over last night's post on the Chrome OS told me that I didn't really write anything containing useful information either, so here's an update (don't blame me, I really needed to go to sleep).
 
The Chrome OS will apparently be running on top of a (albeit heavily modified) Linux kernel. I can only begin to envy the guys who are working on this stuff as part of their paying job... For now, the scant information I can gather on the architecture of the future OS points to the Chrome browser acting as a sort of front-end for a minimalistic operating system, though Google's description of that arrangement sounded more like running an operating system inside a browser, not the other way around.
 
We would do well to remember that the Chrome browser is a different beast from most other browsers on the market, with elaborate features like... well, process separation for all the individual tabs within a session, and very likely support for multi-core rendering in the near future (an interesting tidbit: apparently someone in the Firefox community suggested building a separate-process mechanism into the FF browser way before Chrome ever came out. He/she was more or less ignored, of course. And as they say, the rest is history). I guess they are thinking of making every event within the OS happen within the browser, with each tab of the browser working as a separate application within the OS? I can certainly visualize the idea in my head, but I don't know how to make it run well in practice... But then I'm not an operating systems engineer at Google.
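To make the per-tab process idea a bit more concrete, here's a toy sketch in Python (nothing to do with Chrome's actual code): each 'tab' lives in its own OS process, so one misbehaving page can crash without taking the rest of the session down with it.

    from multiprocessing import Process

    def render_tab(url):
        if "badsite" in url:
            raise RuntimeError("renderer crashed")   # simulate one misbehaving page
        print("rendered " + url)

    if __name__ == "__main__":
        tabs = [Process(target=render_tab, args=(u,))
                for u in ("example.com", "badsite.com", "wikipedia.org")]
        for t in tabs:
            t.start()
        for t in tabs:
            t.join()
            # only the crashed tab reports a nonzero exit code; the others are untouched
            print("tab finished with exit code %s" % t.exitcode)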
 
The emphasis of the Chrome OS will be on four things: simplicity, agility, security, and the cloud. The first three seem to be the gold standard for operating systems these days. And they should be. People have been domesticated by overly large, clunky operating systems that get twice as slow every year for a while now. And it's not even because of new functions. It's because of the legacy code. The official Google blog post on the Chrome OS puts it best as 'not wanting to wait for the operating system to come up so they can use the browser.' Well, in my case it's not just the browser, but I certainly sympathize. I have work to do, things to read and write. And I need to sit in front of my computer staring at a logon-splash-loading screen cycle every single day, and then wait for the operating system to calm itself down after it finishes loading. After all that ritual is over I can finally begin to get some work done, though sometimes I need to wait for the antivirus/firewall program to stop acting up as well if I want a quiet, stutter-free computing experience. Sure, the whole thing I just described takes about half a minute to a full minute, and then another minute for operating system stabilization on most modern laptops. At most it takes about two minutes on a bad day, and less than a minute on a good day. Yet most people don't have 'modern laptops.' Any operating system that performs 'ok' on modern hardware will probably slow any older hardware to a crawl. Since most people these days use their computer at least once every day, they get used to such slowdown to the point that they don't even notice something's wrong with it. Only after running into newer hardware or a different version of an operating system do they realize how painfully slow their own systems are (which might have been one of the many factors that contributed to Apple's rise to stardom in the OS arena).
 
Google is stating that they will get rid of the useless middleman, or at least squash him down to the size of an invisible midget, with the new Chrome OS. They are promising instant-on functionality with a smaller memory footprint and lower processor usage. They are promising a fully fledged operating system fit for a netbook. Granted, there are a lot of smaller Linux flavors out there that measure in the megabytes with a full GUI and application suite, but Google is also promising the Google level of engineering, architectural innovation, and technical support. From what I read, they are shooting for an OS that takes 'only a few seconds' from booting to getting work done, which would be unprecedented even among the smaller Linux distributions out there. The whole of the operating system will run in the web browser, with no need for things like desktops and docks. All the applications for the OS will run within browser windows (with, as stated before, each browser window being a separate session) and those applications can range from word processors, movie players, and MMORPGs to wholesale emulation of another operating system running off the cloud, without requiring much processing power from the client-side computer. For someone who lives with the net all day, such an OS is a dream come true, especially when the primary ethos of the OS is being secure and lightweight.
 
Yet the cloud technology (writing about it makes me feel like I'm talking about the luminiferous ether) which is being considered the enabler of the philosophy behind the Google OS might prove to be one of its biggest weaknesses... Or rather, I'm fine with it being a weakness of the OS, but there are some people out there who fear that cloud computing would actually do harm to computing culture in general. Like Richard Stallman, and Cory Doctorow (to a certain extent), both huge proponents of the liberal software movement most people refer to as opensource. I do think they have a point there. An exclusively cloud-based computing culture is no longer a technological movement, it's a consumer movement. As computer technology becomes more and more pervasive in human society (think MIT's Oxygen project), it can't help but become consumer-oriented, ruled by the laws of economics rather than creativity... Unlike some other radical proponents of the free market (who view its principles as some sort of universal panacea), I view the possibility of a consumer-oriented computing culture as something with significant potential for harm, especially when it hands the tools for separating consumers from their own machines to large corporations capable of building and maintaining the large-scale server complexes required to support cloud computing on any significant level. Of course, the cloud computing scenario isn't all gloom and portents of doom. It's fully possible that the server technology of the post-cloud future will advance to the point that any interested individual can run a 'cloud computing service' out of his or her garage, private property where they can dictate their own terms.
 
One thing the Chrome OS can do, or rather, must do to remedy such a possible weakness of a cloud-based operating system would be to include a development/scripting environment with the OS. Since the Chrome OS is based in large part on the Chrome web browser, I'm guessing the browser+address bar interface can provide some type of shell access to the user. The command-line/shell application could run in the browser window itself when called through the address bar. I don't know about compiled languages, but it should be relatively simple for the OS to provide scripting language support like Python/Ruby and Lua... Android's recent inclusion of a scripting environment that supports Lua and Python certainly is a good sign of things to come, even though the two operating systems are said to be different from each other... The important thing for the Chrome OS would be to have great support and tools for letting users develop their own applications within the Chrome OS, instead of making them rely on other systems, pay-to-purchase tools, or high-end hardware dedicated to software development.
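Just to give a flavor of what I mean by on-device scripting: here's a minimal sketch of the kind of thing the Android Scripting Environment already lets you do in a few lines of Python (assuming the ASE 'android' module on the handset; I haven't tried it on every device). Something equally small, callable from the Chrome OS address bar, would go a long way.

    import android

    droid = android.Android()
    # pop up a simple input dialog and read the user's answer
    name = droid.getInput("Hello", "What's your name?").result
    # show the result as a toast notification
    droid.makeToast("Nice to meet you, %s" % name)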
 
The Chrome OS isn't set for release on netbook platforms until the second half of 2010, although the official platform-agnostic release will come well before that. I'm guessing it's a move to optimize the OS before it gets released to a wider audience. Surely we'll get to catch more glimpses of the OS in development, or perhaps in action, soon.
 
I'm sick and tired of OSes that fail me all the time. I just can't wait for the Chrome OS to be released and bring something unique to the dull and constricted OS marketplace.

Tuesday, July 7, 2009

Google Chrome OS

It's almost two in the morning now in the city. I was just about to wrap up my writing session with Mathematica/LaTeX when strange news of a new operating system came flooding in through the internet. The RSS reader registered a mention of Google Chrome OS from Ars Technica while I received links to an article on the NYTimes about the possibility of a Google OS, and then the floodgates opened on FriendFeed and Twitter. Apparently the Google OS is something people have been secretly fantasizing about all these years, since it's currently trending at the top of the Twitter trend list recently occupied by MJ's death. Yes, MJ fans. News of a computer operating system that's not set to be released for another whole year just beat out MJ's death on Twitter. Read it and weep (no disrespect to MJ, but it's about time).

There isn't any detailed news coming out of Google regarding the OS at the moment. I'm risking disturbing my sleep routine on a workday (yet again) to find more news about it, being the OS junkie that I am (I'm not good with computers, I just like operating systems for some odd reason). Yet there's no new news beyond the single post on the official Google blog and a smattering of articles on the web, most of which point back to that official Google blog post.

I don't know what to say. I'm probably one of the few zillion people who've been waiting for Google to get to work on some kind of major operating system for a while now. I'm something of a hobbyist on the Android platform already, and my primary web browser was Google Chrome right up until Firefox released 3.5 and one of my favorite otaku web celebrities went to some lengths to support it. I'm seriously considering switching back to Chrome now; since all my bookmarks are on Delicious, I don't have to worry about switching browsers anymore. It's all available from the web through the horrible alchemy of cloud computing.

I am somewhat skeptical on the whole Google OS issue, though. While I love the idea, and I certainly will install it on a new PC (I will need to hunt down some old boxes I don't use anymore... Sadly I can't really experiment on my work laptop), pushing out a whole new operating system is kind of a big deal. And this OS is supposed to be based on Chrome, the web browser. The last OS to attempt something with such bravado was Windows (the whole of the operating system shell ran on Internet Explorer) and they really got burned for it. Although I must say it would have turned out a bit differently if the web browser Microsoft used as their operating system shell wasn't such a clunky mess... So would it mean that the Google Chrome OS stands a chance of becoming a classic operating system like the hallowed OS X and WinXP (I know, I know, but lasting for a decade as the de facto operating system of choice for the world takes something more than marketing savvy)? The Chrome web browser is certainly leagues better than what Internet Explorer was/is, after all.

There's one more thing I'm worried about in particular, though. The Chrome web browser is something of a resource hog, though not nearly on the same level as IE. Chrome, perhaps due to some of its advanced features, can't run on older operating systems, whereas the modern Opera browser, for example, has no problem running on Win98. Also, I do remember some people complaining about the responsiveness of the browser on anemic/archaic hardware. I still remember the day I found out that a 5-tab Chrome session would slow down my then three-month-old Thinkpad if I left it open and running overnight, due to some issue in memory/processor management. The most recent update to Chrome fixed most if not all of those conspicuous issues, but I still have my doubts. This is especially alarming since the first iterations of the Google Chrome OS will be coming out for netbooks, which tend to have even weaker hardware than laptops, which in turn fall behind desktops.

There's also the question of how viable the future market for netbooks might turn out to be. It's always possible that people will decide they want cheap, larger laptops after all (which is probably the reason some of the bigger companies still don't have a decent netbook line-up of their own. I'm looking at you, Sony. The jokes you put out on the marketplace shouldn't even be considered netbooks, though you do make some good laptops).

And of course, as with the Google Chrome/Chromium web browsers, the whole operating system will be open source, I'm guessing in a similar style to Linux or the BSDs. Which in itself should be enough to make people drool all over themselves... Now that I think about it, my current netbook, the Asus 701 4G (the first commercial netbook that ever came out in the U.S.!), is about due for an upgrade. Maybe I should hold out for a year and get my hands on one of the first production units of the Google Chrome OS netbooks!

I eagerly await our future Google overlords.

Biolinks

Just some bio related links for people who might be interested.

The molecular biology webpage usually has really cool resources for biology scientists and enthusiasts alike. So it's a given that they would have something pertaining to perhaps the most widely used web browser among biology/biotech people out there.

This year's list of must-have biotech Firefox extensions for researchers has some useful research tools, though for some reason they neglected to mention some of the other cool extensions like BioFox and other in-browser biosequencing tools. There are operating systems out there geared toward scientific research, and it's about time we had a browser like that for the scientific community as well. Until someone gets the idea to build a fully fledged research web browser, Firefox is the one that best fits the profile due to its extension system.

They also have a website listing the 10 best podcasts related to biology. I highly recommend the U.C. Berkeley molecular biology course. They cover most of the basics in a very beginner-friendly and comprehensive manner without skimping on the details like some of the other beginner-friendly academic podcasts out there.

I think the idea of an academic-research-oriented web browser is something worth thinking about. Since the opensource WebKit rendering engine is already out there, most of the major work would involve figuring out the best services and layouts for academic research to build into the browser (with extension support?) and optimizing the code for lower memory usage... Huh, put this way the project suddenly sounds a bit overwhelming.

Sunday, July 5, 2009

An excerpt-small hours of the morning

It's way past three in the morning and I still can't go to sleep for some reason. Maybe I'm suffering from an onset of insomnia? I certainly do feel rather tired, so why won't my brain stop spinning thoughts and go to sleep? I'm getting worried. I have an early day tomorrow.
 
Anyway, might as well make the best of the waking time I have. I read an interesting article pointed to by Doctorow's Twitter post. It was about science fiction and the abundance of practicable facts. I want to share an interesting quote I found in it.
 
""
Or, more succinctly, in order to get the marketplace off its ass to solve the impossible, you have to just pull off the highly improbable and make sure everybody knows about it. Show it can be done, show how you did it, and watch the "marketplace" attack because you've made the "premise" "plausible."
""
 
Hopefully my blackberry-to-email-to-blog scheme is working out here; I'd hate to have the whole page messed up like it has been a few times before.
 
I love that quote. Maybe it would have been better if I could find something cleaner and more concise, but hey, this is a post being written on a handset while I wait to fall asleep.
 
The best way to get people to do what you want is to do the improbable. The best way to teach a subject to a person might also be to show them the possibility of the impossible. Between students who've witnessed firsthand the possibility of an improbable exercise (composing a working genetic circuit on a blackboard through abstract symbols) and students who think such things are still within the realm of science fiction or upscale laboratories, it is obvious who would be more enthusiastic about learning. Of course, when you get into it there's the whole issue of the student possessing a certain degree of curiosity in the first place, but I won't get into that here and now.
 
I love the idea of always-on network access and unlimited internet on mobile devices. I can definitely understand how many Japanese people almost exclusively use their handsets for their online presence and blogging/photo sharing needs. Makes sense. If all you need to do is write up informal blog posts, post pictures of your favorite moments in life, and keep a digital connection with friends, why bother getting even a laptop? All of that is possible with your handset, and it has the added benefit of allowing you to compose your work while on the move. It means you can use that wasted time for something a little more productive. Modern society at large, despite its rapid pace, is surprisingly wasteful in terms of time management. Many people waste hours per day just getting where they need to be, and classes waste hours per day waiting for kids to sit down. Mobile technology offers people a chance to use those wasted moments to do something creative, whether it be writing a keitai novel, taking and posting interesting pictures and videos, or just plain reading... The apparent lack of interest in mobile technologies by large corporations in the United States (at least before the iPhone came out) is difficult to understand, and just goes to show how complacent those people can get.
 
Now that I think about it, do current-generation Bluetooth-ready cellphones have the capability to connect to a printer? Now that would be awesome. The cellphone OS could provide the basic framework for setting up a page, and the user would simply type on their mobile and print the result directly, whether it be a short story, an important email, or pictures taken along the way that you want in paper format for whatever reason. It would certainly make it possible to do lighter homework on mobiles.
 
If there is a problem with the mobile-oriented internet society, it would be the difficulty of content creation inherent to the platform. You can certainly write and compose/draw using the tools the corporations give you, but most often you can't go beyond that. Japanese keitai culture, while impressive, is also one of the most oppressive I can think of in terms of software/hardware freedom (what makes it even more impressive is that the content generated by this harshly locked-in culture is even more impressive than that of the supposedly 'freer' U.S. market. What does that imply?). It's more or less out of the question to be able to run scripts or program things on your mobile. Most often, applications available for purchase for your mobile are locked in by the carrier with no opportunity for transfer, and all the rights to the application are owned by the carrier. It would be preposterous to demand source code for anything. It's almost as if the customers of the carrier companies are spending machines that print money for the corporations within a tightly controlled ecosystem (don't be too hard on those Japanese carriers though. U.S. carriers are more or less the same, and some others are even worse; it runs like communism with the cellphone carriers sitting at the center). Cory Doctorow once tweeted that the locked-down nature of mobiles makes them unlikely to be a suitable communication/computing platform in third-world nations. It's a good point.
 
And that is precisely why we need to work toward open-specification, open cellphone systems. Google's Android is a good first step, integrating relative platform freedom with the user's ability to write and run scripts in languages like Python and Lua on their own machines. Yet I believe it would be unrealistic to count on developing nations getting their hands on those multi-hundred-dollar gadgets that need to be recharged every day. In realistic terms, deploying current Android-based handsets in developing nations would force many families to choose between a half-year's schooling for their kids and a fancy handset. At least the OLPC was an education platform. A mobile is something that might go beyond that. Any realistic deployment of mobiles must be based on people's ability to afford that piece of technology, even if the telecommunications infrastructure for the mobiles is subsidized by their local governments.
 
It means that we need a new strategy for future-proof mobile deployment in the developing world. Something simple. Something that DOESN'T HAVE A TOUCH SCREEN. It might not even need a color screen, just something monochrome that can be visible at night. Something that does not need PC sync. Something that lasts for days on a single charge like any decent business phone. Something cheap with a flexible enough OS that a user with enough technical knowledge can program/script it from within. Something cheap and reliable and DRM-free, so people all over the world can knock off their own versions like they did with the AK-47 from the old Soviet Union, except that these handsets would save lives and businesses instead of ruining them.
 
That would be the mobile to bring the rest (read: majority) of the world into the wired future. It might even be the basic framework to build our own future on. iPhones and Google Ions, remarkable devices as they are, just don't cut it when we begin talking about the future.
Sent via BlackBerry by AT&T

Posted via email from bookhling's posterous

Friday, July 3, 2009

Mnemosyne and personal commuters

I'll get the personal commuter bit over with first.
Modern American society is built around the culture of suburbanites. While I've spent practically all my life in urban centers with high-rises, in America and elsewhere, there's no doubting that the very culture and technology that runs what we now refer to as American, as opposed to Japanese or European, revolves around the concept and execution of suburban living centers. Indeed, in this country life-long urbanites like myself living in NYC are more of an aberration. And one of the major factors that makes such a suburb-based social system even possible in the first place is the personal mobility offered by the presence of cars. Lots and lots of cars. Most of my friends in city areas don't own a car and frequently don't feel the need to, since using the public transportation system plus some assortment of short-term rental car services almost always works out cheaper and easier than the endless battle with cars, insurance, and parking space. Yet it would be suicidal for the suburbanites who compose the vast majority of the population of the United States not to own a car, unless they expect to walk along highways with groceries and school children in tow.




Apparently this is T3's new model of personal commuter vehicle, designed to ship people to their jobs within smaller urban and suburban settings. I'm dearly hoping for something like this to hit the marketplace soon. Its three-wheeled design is certainly underpowered on any rugged terrain, rugged in this sense meaning any terrain with above-average elevation, paved or not. Yet there is convincing evidence suggesting that such a design is actually much more fuel efficient and maneuverable than traditional four-wheeled vehicles. No matter how you look at it, this is the kind of vehicle designed for simple grocery pickups and short-range commutes, a perfect fit for anyone who needs a simple mobility system that's more powerful than a Segway but has a smaller footprint than a fully fledged car. The design leaves much to be desired (personally), but I can imagine something like this practically flying out of dealerships in droves in price-conscious middle-class areas. I can also imagine many developing-world markets flocking to a vehicle like this, especially if their governments get around to supporting international emission standards through some sort of tax break for manufacturers of fuel-efficient vehicles like this.

This makes me think: it's surprising how we still take mobility for granted. Even in this age of internet shopping malls and a semi-omnipresent network, the world comes to a standstill without a real mobility solution in place, something that can get people and things from one place to another fast and cheaply. And it's all the more surprising that we still don't seem to have made any major breakthrough in physical transportation since the heyday of Ford. It almost makes me feel that the world has been standing still in certain aspects of technology necessary for the future.




This is a new 'luxury' USB memory storage device named Mnemosyne, after the Greek goddess of memory. It's ridiculously expensive for what it offers: 16GB of storage inside a 3D jigsaw puzzle container made out of aluminum (what's with the element Al and people these days? Everyone's building something with it. Did mineral prices suddenly drop or something?) costs about 7k euros or pounds, I can't recall which, but in either case the drive is crazy expensive. As any computing enthusiast should know, running an industrial-strength multi-terabyte hard drive array won't cost half as much, though it probably won't look as good.

Personally, I don't think there's anything wrong with charging a premium for design. Someone actually needs to work to make a good design. You see, there's this prevailing myth in the modern age where people for some reason equate good design with good decoration. They are different. There are Zen Buddhist temples out there that have been standing for hundreds of years. They are very simple, with almost no variation in color, thick beams, and practically no decoration in any traditional sense. Each is a huge building that exists as a pure manifestation of geometry in the corporeal world, a place of meditation where Platonic ideas meet with changing nature, creating a timeless void of contemplation. In my mind such temples represent the very pinnacle of what 'good design' means. Not just in their simplicity, but in that the architect used that simplicity as a tool to create a piece of space where human will manifests along with physical reality. There are some late-Gothic cathedrals out there that also embody the very same principles through an irrational exuberance of decoration, the opposite of Zen temple architecture but nevertheless achieving the same goal of good design: creating a timeless yet dynamic space where the human spirit manifests.

And ridiculous price aside, I think I like what the designer of this 'memory device' was going for: solving your way through tangles of timelessness into the very core that contains the memories the owner deems most essential to his or her being. I just wish whoever buys it would actually have the mental capacity to meditate on the experience, but for some reason I remain skeptical in that regard.

Thursday, July 2, 2009

Nethernet and Mr.Brain

The title of today's post sounds a lot like the title of some quirky Japanese NT novel (really, maybe I should write one). I love the NT/light novel scene in Japan. While the vast majority of the works being printed are of rather dubious quality (to put it mildly), the scene is a sort of youth-oriented counterculture to the traditional literature scene, which is as stale as it can get despite the significant volume of works being put out to the marketplace. It gives many of the genre writers a certain degree of freedom in their choice of characters and scenarios, which often ends up producing easy-to-read works of Borges-ish fantasy coupled with an eye for modern trends, definitely influenced by Murakami Haruki's style. It means fantasy novels that are called such because they are removed from conventional reality, not because they follow the 'conventions of fantasy' as in the American publishing industry. The first time I looked through the American fantasy novel market after arriving from the Korea/Japan cultural sphere was certainly a memorable moment, in a very bad way. Hundreds, if not thousands, of novels set in roughly similar worldscapes, so similar that they could easily interchange characters and settings between each other and it still wouldn't feel out of place. I felt as if I was standing in the middle of a desert, a desert of dry, parched books filled with apathetic heroes all cloned from the same gene pool. And this was back in the late 90's, when Japanese manga and anime were still a very underground thing, and a major T.V. station airing any kind of anime was some kind of wet dream. It was during those days that I gave up on fantasy entirely and instead turned to fantasy-like literary novels of the European and South American variety, namely Neil Gaiman, Michael Ende and Jorge Luis Borges. I can't believe anyone would choose to read those dry fantasy novels when they could pick up any of the three authors I linked to above and get lost in the true epitome of human imagination, fantasy as it was meant to be. But I digress.

Nethernet in the title of the post is in fact the name of a weird MMORPG-like game that you play through an add-on toolbar on top of your Firefox browser. It's a weird game where the majority of the action takes place in the form of exploring webpages, with different character classes capable of laying traps or countertraps on various websites, or leaving portals and lamp-posts, forming missions and stories by piecing the random components of the web into a structured whole. Some of the missions created by the Pathmaker class include 'the most amazing temples in the world', 'the list of unusual inventions', 'Jabberwocky' and 'brief history of the tubes.' In many ways the game reminds me of Everything2, a predecessor service to our current wiki-dominated landscape, though they were much whimsier than some of the Wikipedians who seem to think they know everything (hint: they don't). All in all, a very interesting experience, and I can see its utility as a very useful learning tool for kids and adults alike. I also love the steampunk-inspired design of the characters and terms within the game. For someone like me who doesn't really have time to devote to a full-length game anymore, the Nethernet certainly provides an interesting alternative.

I've always been partial to J-dramas due to their quirkiness. I think it's some sort of side effect of having real actors imitate manga or anime situations and premises, but despite some corny moments here and there, the experience tends to have a weird, addictive joy to it. My interest in J-drama was recently rekindled by my cousin's visit a while ago. She's a serious J-drama nut, and she brought me a gift of the DVD box set of a series called Nodame Cantabile. It's just as quirky as the rest and serious fun. I love classical music and comedy, and the show has both. You would have to be one pretentious piece of work not to enjoy it.

Well, I've run across another J-drama on the web that I plan on following. It's about a neurologist working in the Tokyo metropolitan science department's criminal sciences laboratory, a setting obviously influenced by the likes of CSI, with the important difference being that this show doesn't really take itself too seriously (just like the vast majority of J-dramas out there). I've only seen the first episode so far, so I don't really know what to make of it right now, but if this show's only half as good as Nodame, I'm good. It's really refreshing to see a drama series that doesn't take itself too seriously. So many people these days seem to suffer from the disease of pretentious philosophizing without any depth; you might as well get some laughs out of it.