Monday, November 17, 2008

Cisco VPN Client for Linux!!!!

Hallelujah!

http://projects.tuxx-home.at/?id=cisco_vpn_client

I installed an Ubuntu 8.10 VMware appliance at home on my Mac Pro and needed to connect to work. I wanted the VM to connect rather than OS X, so I looked for a VPN client. Cisco didn't permit me to use their client, so I found the one I'm using now at the URL above. My VMware appliance is 32-bit, so I first tried installing an older, 32-bit Cisco client from that site, couldn't make it work, installed a bunch of patches, etc. Finally, I just tried installing the latest 64-bit version and it worked! I'm very pleased. Time to stay up even later working.
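
In case I have to do this again, the install went roughly like this, from memory (the exact tarball, script, and profile names are approximations and depend on which version you grab from that site):

tar xzf vpnclient-linux-*.tar.gz          # whichever tarball matches your kernel/arch
cd vpnclient
sudo ./vpn_install                        # builds the kernel module and installs the init script
sudo /etc/init.d/vpnclient_init start     # load the module / start the service
vpnclient connect my_profile              # my_profile stands in for a .pcf profile in /etc/opt/cisco-vpnclient/Profiles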

Thanks, not-Cisco!

Friday, July 18, 2008

SSHFS from ext3 to NTFS woes

This is a follow-up to http://blog.alijnabavi.info/2008/07/difficult-experience-trying-to-create.html.

I thought I had eliminated all of my problems, yet more remained: some of the files had bytes zeroed out. Unfortunately, it took me a while to figure this out. Finally, though, here is the configuration that seems to work (in my .bash_profile):

echo 'me_password' | sshfs me_username@www.me.domain.com: ~/mnt/www.me.domain.com -o password_stdin -o reconnect -o allow_root -o workaround=all -o ro -o umask=222 -o use_ino


Some of that is probably unnecessary or uninteresting, but what solved my problem was "-o use_ino".
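
For my own future reference, here is my rough (and possibly imperfect) understanding of what each of those options does:

# password_stdin  - read the password from stdin (piped in from echo above)
# reconnect       - reconnect automatically if the connection to the server drops
# allow_root      - let the root user access the mounted filesystem
# workaround=all  - enable all of sshfs's workarounds for quirky SFTP servers
# ro              - mount read-only
# umask=222       - mask out the write bits on all files and directories
# use_ino         - use the server's inode numbers instead of locally generated ones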

I didn't want to blindly plug something in without understanding what it was doing . . . HOWEVER :-) apparently I am not ready to learn about use_ino, because the results page Google showed me when I searched for it looked entirely unappealing, especially at 00:35 when I have to get up at 06:00. I'm just happy that my files are intact.

Friday, July 11, 2008

Difficult experience trying to create decent Web environment

Goals:
  1. Put service configurations (Apache, JRun, etc.) from production Web servers into Subversion repositories.
  2. Duplicate the Subversion repositories at a remote location for data security purposes.
These are pretty modest goals, as I'm sure anyone would agree. However, there is a litany of obstacles.

Circumstances:
  1. I don't have shell access to the Web server OS.
  2. The OS is Windows 2000.
  3. I can RDC into the boxes but the only thing I can do is shut down or restart the machine.
  4. I have SFTP access.
  5. The SFTP server is VShell, which appears to behave significantly differently from OpenSSH.
  6. The "server administrators" are unwilling to install a Subversion client on the remote machines. (Of course, since I can't access the OS anyway then I couldn't operate a SVN client even if they were to deign to install one.)
I decided that a solution would be to mirror the remote filesystem on my local machine, in a working copy, and then commit the changes to a Subversion repository on my machine. I would then regularly dump the repository to a remote server via SSH.
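
For the "dump the repository to a remote server" part, something along these lines should do (the repository path, user, host, and destination are made up for illustration):

svnadmin dump /var/svn/webservers | gzip -c | ssh backup@offsite.example.com "cat > /backups/webservers-$(date +%F).svndump.gz"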

That turned out to be tricky. I used lftp (which is awesome) for mirroring the remote filesystems to my local machine, though I felt the filtering/matching mechanisms were not flexible enough for my purposes. I ended up with lftp because rsync over SSH to a Windows box running the VanDyke VShell SFTP server does not work; the reason it does not work is that the "server engineers" disabled SSH shell access on VShell. lftp operates over SFTP, as well as a zillion other protocols, so it worked perfectly.
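
The lftp invocation looked roughly like this (host, credentials, and paths here are placeholders, not the real ones):

lftp -u me_username,me_password -e "mirror --verbose --delete /remote/config ~/mirrors/www.example.com; quit" sftp://www.example.com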

That happened over the course of several months and was reasonably complete a few months ago. A couple of weeks ago, I ended up using FUSE and sshfs to mount all of the remote filesystems locally so I could use bash and GNU utilities on them. It dawned on me that this might allow me to use my local svn on the locally mounted remote Web servers!!! None of that mirroring crap!

Oh, were it so simple. I realized that there is a problem with renaming files with svn over sshfs, and I just could not get it to work properly, even with the "-o workaround=all" option to sshfs. I thought, "Okay, whatever, I can try git." I recently started using git for some local development and thought that perhaps the way it uses the filesystem would not be affected by the POSIX-versus-SFTP renaming problem. WELL . . . it was affected all right. Of course. That's when I realized that since the filesystems were locally mounted, I might be able to try rsync again!

rsync worked almost perfectly right away. The only problem was that I discovered some corruption in the form of incomplete files. For example, I opened a file over sshfs with vi, and the end of the file was truncated and filled with garbage characters. I opened the same file with jEdit and saw the same thing. I opened the same file with jEdit *over sftp rather than sshfs* and there was no corruption. I tried different sshfs and fuse options, but it was when I disabled caching ("-o cache=no") that the corruption disappeared.
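
By "disabled caching" I just mean adding that option to the mount command, e.g.:

sshfs me_username@www.me.domain.com: ~/mnt/www.me.domain.com -o cache=no -o workaround=all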

I just used some rsync filters to protect the .svn directories from deletion and the mirroring went fine. Previously, with lftp, I had to do some pretty convoluted things in bash to ensure that the local .svn directories were not deleted during mirroring.
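
The filter part is roughly this ("protect" keeps matching files on the receiving side from being deleted by --delete; the paths are placeholders):

rsync -av --delete --filter='protect .svn/' ~/mnt/www.me.domain.com/ ~/svn-wc/www.me.domain.com/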

So, it finally works, despite the best efforts of the "server engineers" and "server administrators" and "IT professionals".

I'm a Red Hat Certified Engineer and as of July 12, 2008 I'm still looking for a professional environment in which to work. Feel free to offer me a position with your organization that embraces best practices! :-)

Wednesday, May 14, 2008

Helpful compiz links

http://forum.compiz-fusion.org/showthread.php?t=5483

http://www.nvnews.net/vbulletin/showthread.php?t=102570

Monday, January 14, 2008

I AM RHCT

I passed the Red Hat Certified Technician exam last Friday. Yippee! I don't want to brag but I aced it. :-)

I will take the RHCE exam on February 15 and will try to ace that one also.

Installing Oracle Calendar on Fedora GNU/Linux

The whole reason I was inspired to create this blog is that I installed Oracle Calendar client on a Linux box for the third time and decided to document the process since I'm sure I'll have to do it again.

Now, unfortunately, I'm at home and can't remember what the steps were. :-) I'll try to remember to do it tomorrow.

[time passes . . .]

Okay, it's tomorrow.

Including a tweak to the launcher script that I found online (commenting out the LD_ASSUME_KERNEL export), here's what I did:

cp cal_linux cal_linux.bak                 # back up the launcher script before editing it
cat cal_linux.bak | sed "s/export LD_ASSUME_KERNEL/#xport LD_ASSUME_KERNEL/" > cal_linux   # comment out the LD_ASSUME_KERNEL export
yum whatprovides libXp.so.6                # find out which package provides libXp.so.6
yum install libXp.i386
./gui_install.sh                           # run the Oracle Calendar installer
exit
ocal                                       # try launching the client
yum whatprovides libstdc++.so.6
yum whatprovides libstdc++.so.5            # the client wants the older libstdc++.so.5
yum install compat-libstdc++-33.i386       # provides libstdc++.so.5
ocal                                       # launch again


:-) Some of that was unnecessary. The necessary parts are:

cp cal_linux cal_linux.bak
cat cal_linux.bak | sed "s/export LD_ASSUME_KERNEL/#xport LD_ASSUME_KERNEL/" > cal_linux
yum install libXp.i386
yum install compat-libstdc++-33.i386


I installed it in /opt and created a symbolic link to "ocal" in "/usr/local/bin".


ln -s /opt/OracleCalendar/bin/ocal.sh /usr/local/bin/ocal

New blog!

Yes, can't have just one blog, must have many, many, many of them.

I deactivated my Geeklog at alijnabavi.info because I just wasn't maintaining the back end the way I felt I should, and I was sure I was going to suffer an attack at some point because I hadn't installed a security update. I definitely got a lot of comment spam and was sick of cleaning it out.

THEN. . . .

Then I deactivated my Geeklog at alijnabavi.info (I know I already said that) and moved my stuff over to Google. Now Google hosts my mail and several subdomains. Now they will (indirectly) host my bloggingz!!!!!11 Outstanding!

I was primarily interested in making my resume available so I put that up but have not done anything with the data that is in my Geeklog. If I ever run out of things to do (not likely) then I will post it in some form, maybe as a text dump right out of MySQL.

Okay, that's it. Now to change my DNS.

Excelsior!