1.27.2007

Vista Pricing, Woah!

As a Mac user, I would be more than prepared to buy the next version of OSX because I believe it is worth the $69 (or less if you buy the family pack). I am a student, so I get special pricing; normally it is $129. Okay, so let's see: that means $129 (or $69 for students) buys the best version of OSX for the consumer. Obviously, there is a server variation, but it is clearly meant for servers, just as there is a server version of Windows.

To compare this with Vista, there are seven versions, just for the consumer! Not only that, but the least expensive, stripped-bare, more-worthless-than-XP-Home, upgrade-only version costs more ($99, for an upgrade!) than I would have to pay for the next version of OSX. Okay, so MS has not announced any student pricing, though they do have special software distributions for some universities. Those programs do not cover teachers, students who are not in college, or smaller institutions.

This is certainly a problem, though I am sure that Vista will sell far beyond any previous version of Windows, not because it is great, innovative, or worth the money in any way. It will sell because MS has fantastic, hegemonic marketing. Their marketing is so good that people truly believe they have to have Vista.

All I have to say is NO! I will NOT pay for this!

First Impressions of Parallels+Vista

So, I found out that my school has a program for CS majors that allows us to download the business class version of Vista. As a web developer (not designer) I thought it would be prudent of me to install Vista under Parallels (build 3120). Over the next couple of days/weeks/arbitrary-measures-of-time I'll be writing my thoughts and experiences/difficulties working with Vista. Just to make it interesting, I'll throw in the experiences of my roommate who is running the same install of Vista on his desktop machine (WoW just crashed his XP install...sooo).

The Install


The installation was easy under this build of Parallels. It includes a tool that asks for your name/company and the Vista product key up front, takes care of the networking setup, and skips those steps for you during the install. The only thing left to do is wait for Vista to run through and do its thing. Come back in 45 minutes to an hour and you should have a running install.

Notes: I allocated 16GB of hard drive space in an expanding file and 1GB of system memory. I have a 2.16GHz Core 2 Duo MacBook Pro.

My roommate did not have such an easy time. His install took at least an hour and a half, not including driver and other setup time. More on that later.

Notes: Roommate has a 2.4GHz P4 processor, 1.5GB of RAM, and a ~40GB Vista partition.

First Impressions


Despite having some difficulty with video and sound drivers, my roommate was actually able to run Vista with all the 3D bells and whistles: transparency and 3D transitions (like when a new window "pops" up, which they do all too frequently).

In my experience, I did not have to fiddle with any drivers, though I also did not get the benefit of proper 3D graphics and acceleration. Hopefully either Parallels will include the correct drivers or ATI (in my case) will release some.

Coherence Mode


Amazing! I had not seen Coherence work properly on a previous XP installation and only tried the button as an afterthought. Not only does it take Windows out of its fullscreen window and drop the background, so that the Windows taskbar appears at the bottom of the screen, but it also skips the sometimes annoying cube rotation. Granted, I like the idea of the cube rotation, but it never works very smoothly when combined with a resolution change.

Wrap Up and Miscellaneous Items


My roommate is currently having an impossible time importing his music collection into Windows Media Player 11. It will play and add a song to the library when that song is clicked individually, but it outright refuses to scan a folder, drive, or anything else for the massive amount of music he has. I can't imagine that something so basic would not work. I'm interested to see how he solves it.

1.20.2007

UUID in /etc/fstab

I was confused the other day, while working to move my Ubuntu installation to LVM, about drive entries in /etc/fstab. Instead of the standard /dev/hdx# format, the listings were using a string of random characters labeled UUID. Upon further inspection, this seems to be a change brought about by udev in more recent Linux distributions. To discover the ID of a volume from its device name, try reading the man page for vol_id. In short, as root (sudo!), execute `sudo vol_id /dev/sdb1`. In my case sdb1 is the first partition on the second SATA drive. If you use IDE it would likely be hdx#, where x is the drive letter (a for first, b for second) and # is the partition number on that drive. You can get partition listings with `sudo fdisk -l` (a lowercase L, for "list").
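As a sketch of what this looks like in practice (the device name, UUID, and mount point here are hypothetical; substitute your own from vol_id's output):

```shell
# Ask udev for the volume's identifiers
sudo vol_id /dev/sdb1
# Among the output you should see a line like:
#   ID_FS_UUID=3f9c2a1e-0b7d-4c55-9e1a-2f6d8b0c4e7a

# The matching /etc/fstab entry then replaces the /dev/sdb1 form:
#   UUID=3f9c2a1e-0b7d-4c55-9e1a-2f6d8b0c4e7a  /home  ext3  defaults  0  2
```

The advantage is that the entry keeps working even if the kernel reorders the drives and sdb becomes sdc.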

1.19.2007

AuthShadow

The Apache module mod-auth-shadow, which verifies passwords against /etc/shadow, only supports Apache 1.x in its most recent form. More importantly, it does not work with Apache 2.

/etc/shadow is the successor to keeping password hashes in /etc/passwd: the hashes moved into a file accessible only to the proper users. AuthShadow is important because the alternative is mod-auth-pam, which requires direct access to /etc/shadow. That means the Apache user must have read access to /etc/shadow, and that is a very dangerous proposition: if your server were compromised, the attacker would then have your password file for cracking at their leisure. AuthShadow solves this by using an intermediary script that is executable by the Apache user but owned by root.

However, there is an RPM build available for Apache 2 (which does not work for me on Debian/Ubuntu, but alien solves that issue). It is available from rpmfind.net. The problem is that it is not the latest version of AuthShadow (2.1). This leads to issues because of a uid verification bug in the intermediate validate script. The latest version that avoids this bug is 2.0.54. The error in syslog or system.log is "validate: FAILED VALIDATE: caller uid mismatch, must be 65535 not 33", or something to that effect.

To solve this issue I downloaded the source for 2.1 from the project's SourceForge page and used the included makefile to run `sudo make validate`, replacing the validate script in /usr/sbin/.
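The steps looked roughly like this (the archive name is hypothetical; use whatever the 2.1 tarball from SourceForge is actually called):

```shell
# Unpack the AuthShadow 2.1 source
tar xzf mod_auth_shadow-2.1.tar.gz
cd mod_auth_shadow-2.1

# Build and install only the validate helper, overwriting the
# buggy copy from the older RPM in /usr/sbin/
sudo make validate
```

After this, the "caller uid mismatch" errors in the log should stop.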

Once this works, it can be used in your sites-available configuration instead of AuthUserFile (a file made with htpasswd). In addition, any user with an account on the server can now be validated with "require valid-user" or "require user username". This is fantastic in combination with WebDAV and/or Subversion repos. For Subversion, you would probably want to research authz to control directory-by-directory permissions for repositories.
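A minimal sketch of the relevant directives in a sites-available config (the location, realm name, and username are made up; the AuthShadow directive is per the module's documentation):

```apache
<Location /svn>
    AuthType Basic
    AuthName "Subversion Repository"
    # Check passwords against /etc/shadow via the validate helper
    AuthShadow on
    Require valid-user
    # or restrict to a single account:
    # Require user alice
</Location>
```

No AuthUserFile or htpasswd maintenance needed; system accounts just work.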

1.14.2007

Painfully White

My friend and all around crime-fighting partner, Knuckles/CFICARE, has a new blog up. It is entitled Painfully White, oddly enough.

What For?

Anyone reading this blog may be wondering why I picked the name CodeCollaborative. Well, that would be the name of my new project and I'll just leave the rest of the guessing up to you. That's all you get for now!

1.13.2007

Creating VirtualHosts on a Local Server for Testing

After an exhaustive search marathon on Google, for lack of a better word, I have masterminded a way to test a website (built on PHP, in my case) as if it were actually deployed at its own DocumentRoot. Basic knowledge of Apache configuration is required (and assumed, since you're actually reading on).

Subversion is fantastic and I love running my own Apache server on my Mac. My problem appeared when I updated between my computer (where the files are in a sub-directory of the DocumentRoot) and the actual deployed server (where the files are the DocumentRoot). For includes to resolve to their proper paths, every file would have to be altered individually each time I updated, effectively breaking the paths when updated on the other server.

Now, this may not seem like rocket science to anyone with experience working on deployed websites, but there are a few steps that have not yet been grouped together. Specifically, if your test server runs only on a local network or machine, and you don't have a domain pointing at it (in which case DNS would take care of resolving a name for Apache to pick up on), simply adding an extra VirtualHost does not work.

The Solution: on most *nix systems you can modify /etc/hosts (try using sudo, not su) and add an entry in the same pattern as localhost, with a name of your choosing. In my case the entry was 127.0.0.1 thanatos. Now, in your Apache configuration (httpd.conf or sites-available/sites.conf), add the line ServerName name-from-hosts to the VirtualHost section that already exists. Repeat this process, from /etc/hosts to .conf file, for each website you wish to have its own DocumentRoot, this time using a "subdomain" still pointing at 127.0.0.1. For each "subdomain", add a new VirtualHost section whose DocumentRoot is the directory you wish to use for that site and whose ServerName is the entry you added to /etc/hosts.
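Putting that together, the hosts entries and VirtualHost sections might look like this (the hostname thanatos is mine; the site name and paths are hypothetical, so swap in your own):

```apache
# --- /etc/hosts ---
# 127.0.0.1   localhost
# 127.0.0.1   thanatos
# 127.0.0.1   mysite.thanatos

# --- sites-available config ---
NameVirtualHost *:80

<VirtualHost *:80>
    ServerName thanatos
    DocumentRoot /var/www
</VirtualHost>

<VirtualHost *:80>
    ServerName mysite.thanatos
    # The sub-directory becomes this site's own root
    DocumentRoot /var/www/mysite
</VirtualHost>
```

Apache matches the Host header from the browser against each ServerName, so each "subdomain" gets its own DocumentRoot.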

When finished entering all the new VirtualHosts, make sure that the site configuration is enabled. In Apache 2, just check sites-enabled/ (same path as sites-available) to see if the .conf file is there. If it is not, use a2ensite if it is available; otherwise link it in with ln -s. Now, RESTART APACHE!
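On a Debian-style Apache 2 layout, that last step is just (the site name here is hypothetical):

```shell
# Enable the site if a2ensite is available...
sudo a2ensite mysite
# ...or symlink it into sites-enabled/ by hand
sudo ln -s /etc/apache2/sites-available/mysite /etc/apache2/sites-enabled/mysite

# Then restart Apache so the new VirtualHosts are picked up
sudo /etc/init.d/apache2 restart
```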

The Result: when you navigate to your subdomain.localhost (or the like), your sub-directory will be in its own root, and all includes, links, etc. will be accessible as if it stood alone.