Archive for the 'software' Category

Computer Trouble.

January 3, 2009

Last night I tried to boot my flagship computer and had some weird trouble.  First, I got an overclock failure message, and then it wouldn’t boot fully into XP.  It also seemed very, very slow.  I tried resetting to factory defaults in the CMOS setup, but that had the very undesirable result of knocking out the RAID 0 boot disk.

So even though it’s back to factory defaults, I had to convince it to pick the RAID back up without trashing the Windows partition.  This machine is old enough that you have to make a floppy to load the RAID drivers.  Around midnight, I remembered that there was a BIOS setting necessary for the RAID.  I flipped that back on and got it to boot (but very, very slowly).  Because the slowness starts with the POST screen, I do not think this is a software problem.  I might try to flash the BIOS.  I also see that Norton has a program called Ghost.  I’ve never messed with imaging programs because I’ve always kept my data on a separate disk, and since I had the program disks, I felt I didn’t need to worry about the programs themselves.  Of course, now I’ve accumulated so much crap that I don’t even know what I have.  Plus, I’ve got a lot of codecs on this machine.  So I want to try to make an image.

Something similar happened to two AMD boards that I had years ago before they died.  I fear this computer’s time grows short.  I detailed its creation on this blog.  You can see that it’s lasted a scant 2 1/2 years.  This is not typical for a computer that I build.  I usually get 3 to 5 years without any problems.  I think most would agree that it’s probably the overclocking.

Of course, now I know a lot more about overclocking and how far you can push components.  The price of my knowledge has been this computer’s potentially premature death.  I’ve been joking that this is just an excuse to build the i7 rig that I’ve been thinking about (and maybe I will in the next few weeks), but for now, I’m mourning the decline of something I created.

Becoming an Adobe fanboy.

December 27, 2008

One of my goals over my holiday vacation was to learn Dreamweaver, Photoshop and Flash.  Later I might learn more about Premiere, but I’m told that Vegas Pro might be the way to go.  I’ve been sick, so I haven’t made the progress that I’ve sought, but I have made inroads on the Dreamweaver book.

I want to take the blog that I do for work out of free WordPress hosting and put it into an environment where I pay to host it but where I also have more control.  I probably don’t need to learn Dreamweaver to do this, but I think it may be a good start.

Also, learning new software causes me to make mental connections that help in ways that I have never considered.  I think that this whole software learning project is going to take several months now.  If I decide to buy the software after playing with it for a while, it’s going to be expensive.  I want to be absolutely sure that I’m going to use these tools.

You might ask why I don’t stick to open source.  Certainly, there are some alternatives.  Everyone knows about GIMP, for example, instead of Photoshop.  I’ve been using GIMP for a while for quick photo editing.  I want to get past superficial editing, though, and really learn about the layering features that Photoshop offers.  I think that’s going to be the most fun part of the project (along with video editing).

Doing this for fun is also a real motivator.  Just starting the Dreamweaver book has made me think of some unrelated server projects that might be interesting.  So there’s real value in this and it could be engrossing too.

Building a PC: Tall or Wide?

September 6, 2008

As I see it, anyone who wants to build a computer today needs to make a choice:  should it be built to be as fast as possible without regard to the number of cores, or should it have as many cores as possible without clock speed being a priority?  In other words, should it be tall or wide?

If you want to go tall, it seems to me that you should buy the most expensive dual core chip you can find and then overclock it as far as it will go.  If you could get a dual core up to 4.6 GHz, you would have a fast machine indeed.  Couple this with fast memory and a Raptor RAID 0 setup, or even a flash drive, and you have raw speed.  Of course, with only two cores, it’s not wide.  But do you need it to be wide?

On the other hand, if you buy a Skulltrail motherboard and put two quad-core chips on it, then you’ve got something that’s wide.  You have eight cores!  Even with overclocking, you probably won’t be able to reach the same speed as the overclocked dual core.  However, if the software you’re using can handle multicore processing, this computer will smoke the dual core system in that application.

But there aren’t that many programs that can use quad-core chips effectively.  Today, it would seem that you’re better off with a fast dual core than spending crazy money on a quad or dual quad system.  This won’t always be the case, but for probably the next six months, it’s true.

Building a 32 TB Server: a thought experiment.

August 31, 2008

Daniel Gimpelevich and Holden Aust built a 16 TB server for Christian Einfeld and his Digital Tipping Point Project.  See Linux Journal, Issue 173, September 2008.  I am impressed that these gentlemen built a server with four times the capacity of anything that I have ever attempted.  It’s funny because in Einfeld’s article he mentions it almost in passing.  My jaw was on the floor.  Also I think it’s cool that he’s a lawyer who is also very much into technology.  Moreover, his philanthropic efforts in San Francisco are admirable.

The server they built motivates me to try to build a 32 TB server.  There are three problems that I have not worked out.  One: fitting 16 drives in one box.  I would wait to build the server until 2 TB drives are obtainable.  I am assuming that I can find a case somewhere that will hold 16 drives.  If I can’t, I would have to have some sort of external enclosure and run SATA cables to it.  Two: I don’t know if FreeNAS can handle 32 TB of storage.  If not, I’d have to use some other platform, but I suspect it could do it or could be made to do it.  Three: I don’t know if you can put three or four SATA cards on one motherboard.  Obviously these gentlemen figured that part out.  It must be possible; I just don’t know how to do it.  It may be as easy as plugging them in.

If I were to succeed, a RAID 5 FreeNAS server would provide 20.8 TB of usable space out of the 32 TB available.  Since you have to do backups anyway, it almost makes sense to have two 32 TB RAID 0 servers instead, as you would get 27.73 TB of usable space on each and faster performance.
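The capacity math above can be sketched quickly.  The standard formulas (RAID 5 loses one drive’s worth of space to parity; RAID 0 loses nothing), plus the conversion from the decimal terabytes drive makers advertise to the binary tebibytes operating systems report, get you in the neighborhood; my figures come out a little lower still, which I attribute to filesystem overhead.  A minimal sketch:

```python
# Sketch of RAID usable-capacity math for a 16 x 2 TB (decimal) array.
# Real-world usable space will be somewhat lower due to filesystem overhead.

TB = 10**12   # drive makers count in decimal terabytes
TiB = 2**40   # operating systems report binary tebibytes

def usable_bytes(n_drives, drive_tb, level):
    raw = n_drives * drive_tb * TB
    if level == "raid0":
        return raw                               # striping: no redundancy
    if level == "raid5":
        return raw * (n_drives - 1) // n_drives  # one drive's worth of parity
    raise ValueError(level)

for level in ("raid5", "raid0"):
    tib = usable_bytes(16, 2, level) / TiB
    print(f"{level}: {tib:.2f} TiB usable")
```

Run against 16 drives of 2 TB each, this reports roughly 27.28 TiB usable for RAID 5 and 29.10 TiB for RAID 0, before any filesystem overhead.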

At this point, this is just a thought experiment.  In terms of money, when the drives become available, we’re not talking about that much compared to other types of extreme computing.  For example, some people will spend in excess of $14,000 or $15,000 buying an overclocked “ultimate” machine.  A 32 TB server would probably only cost $2,000 to $3,000 to build.

I have to admit it is exciting.  I don’t know what I would use it for.  I still have plenty of space on my 4 TB server that only has 2.6 TB of usable space.  Even with an HD TiVo and pulling HD content off of it and putting it on the server, I don’t think I would need anything close to 32 TB of space.  But it would be fun to build.

FreeNAS Idiosyncrasies.

July 27, 2008

I enjoy using FreeNAS. I have three FreeNAS servers. Why would anyone need three? It’s because of the idiosyncrasies associated with using the software. For one thing, Vista doesn’t like FreeNAS very much in certain configurations, and I have three Vista machines. For example, if you set up one of your servers in a RAID 5 configuration, Vista will read it just fine, but it won’t write to it, because it can’t get an accurate reading of the disk size and thinks the disk is full. XP does not have this problem, thankfully. If it did, I would not be able to use FreeNAS in a RAID 5 configuration.

I overcome this idiosyncrasy by having another server set up with its disks mirrored. Vista reads disk mirrors without any trouble. All I have to do is use an XP machine to sync up the data from the disk mirrors with the main server. This way I have all my data on one machine. I use the third server to back up the main server. In a way, it seems ridiculous. However, FreeNAS servers are cheap (the software is free). Moreover on a gigabit network, data transfer is fast. You could accomplish what I do with a Drobo. However, it wouldn’t be nearly as fast. Also I’m not sure that you could stream video from a Drobo. FreeNAS servers make excellent video servers.

I started out using rsync to keep my servers synced with each other. This quickly crashed, and I couldn’t figure out how to fix it. So I moved to Allway Sync. This program has worked wonderfully. Of course, FreeNAS wouldn’t be FreeNAS without an idiosyncrasy here as well. You have to reboot the servers more than you should. Often, after transferring gigabytes of data, the server will drop out and need to be rebooted. I haven’t lost any data, and the servers are fine once you reboot them. Sometimes, with Allway Sync, syncing one-way as opposed to bidirectionally works better.
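For anyone curious what one-way syncing amounts to, here is a minimal sketch in Python.  This is my own illustration of the idea, not what Allway Sync actually does: walk the source share and copy over any file that is missing from the destination or newer than the destination’s copy.

```python
import os
import shutil

def mirror(src, dst):
    """One-way sync: copy files from src that are missing or newer in dst."""
    copied = []
    for root, _dirs, files in os.walk(src):
        rel = os.path.relpath(root, src)
        target_dir = os.path.join(dst, rel)
        os.makedirs(target_dir, exist_ok=True)
        for name in files:
            s = os.path.join(root, name)
            d = os.path.join(target_dir, name)
            # Copy if the destination file is missing or older than the source.
            if not os.path.exists(d) or os.path.getmtime(s) > os.path.getmtime(d):
                shutil.copy2(s, d)  # copy2 preserves timestamps
                copied.append(os.path.join(rel, name))
    return copied
```

Bidirectional sync is harder because both sides can change between runs; when a big transfer only needs to go one direction, the simpler logic has fewer ways to go wrong, which may be why one-way syncing behaves better for me.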

The latest FreeNAS joy has been having two of the servers spontaneously switch IP addresses. I have no idea why this happened. I thought for a moment that maybe there was some malice involved from a third party. But no. My best guess is that IP addresses opened up and that the servers rebooted and picked those. I’ve been using FreeNAS for a long time and it has never done that.

You might think that is a lot of effort to use these servers, but I can tell you that it is not so. While I have identified the above issues, they are all manageable. The servers have great uptime and I can move my data around quickly.  Plus, it is awesome to have all my data in one place.  It’s like having an old car that has problems, but you know what the problems are and you know how to fix them, and in the meantime the car gets you where you want to go. With all the redundancy I have with these servers, I believe my data is safer than it has ever been (knock on wood).

Weizenbaum and Artificial Intelligence.

March 15, 2008

In the Wall Street Journal of all places, I read that Joseph Weizenbaum had died. He created the ELIZA computer program that simulates human interaction. While not a sophisticated program, it is always mentioned (and always will be mentioned) in discussions of artificial intelligence and Turing tests. With his simple program, Weizenbaum immortalized himself as an AI pioneer.

Even though artificial intelligence has its critics, it is already ubiquitous. Just today, I called UPS and spoke at length with a computer. It used voice recognition technology quite effectively to identify my package number. Even now, I am writing this with NaturallySpeaking 9.5. This voice recognition program is inherently based on artificial intelligence algorithms.

But artificial intelligence is not voice recognition alone. AI “perceives its environment and takes actions which maximize its chances of success.” Weizenbaum, clearly a gifted man, gave up computer programming and the field of artificial intelligence altogether later in life. In a way, he was better off than two recent leading authorities in the field. However, based on what I’ve read about him and his work, he really felt that humans shouldn’t rely on machines for decision-making. Of course, now, we do that every day. Pilots use AI to fly airplanes. People rely on AI in their cars without even knowing about it. At some point, people will rely on AI to make decisions about their lives. In one’s PDA, one will have a virtual psychiatrist/business planner/personal coach always at one’s fingertips. I suspect that Weizenbaum would not approve of this, but I think it’s better than, say, relying on Astrology for that same advice.

Weizenbaum was particularly put off by the fact that when ELIZA came out, people really took it seriously. Some people really couldn’t distinguish a simple pattern recognizer from a human being. If you have played with ELIZA, you may find this hard to believe. But remember it came out in 1966. No one had much experience with such things back then. No one had much experience with computers at all. This is how far ahead of his time Weizenbaum was.
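If you have never seen how little machinery is involved, here is a toy ELIZA-style exchange in Python.  This is my own few-rule sketch, nowhere near Weizenbaum’s full script, but the trick is the same: match a keyword pattern, reflect the pronouns, and drop the result into a canned template.

```python
import re

# Toy ELIZA-style responder: keyword patterns, pronoun reflection, and
# canned templates. Weizenbaum's real script had far richer rules.
REFLECT = {"i": "you", "my": "your", "am": "are", "me": "you"}

RULES = [
    (r"i need (.*)", "Why do you need {0}?"),
    (r"i am (.*)", "How long have you been {0}?"),
    (r"my (.*)", "Tell me more about your {0}."),
]

def reflect(fragment):
    # Swap first-person words for second-person ones ("my" -> "your").
    return " ".join(REFLECT.get(w, w) for w in fragment.lower().split())

def respond(sentence):
    cleaned = sentence.lower().strip(" .!?")
    for pattern, template in RULES:
        m = re.match(pattern, cleaned)
        if m:
            return template.format(reflect(m.group(1)))
    return "Please go on."  # default when nothing matches

print(respond("I am sad about my computer."))
# prints: How long have you been sad about your computer?
```

A handful of rules like these, multiplied a few hundred times over, was enough to convince people in 1966 that they were talking to something that understood them.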

At the same time, I think his dismay at how stupid people can be was misplaced. Artificial intelligence, in the form of an interactive program designed to pass Turing tests, has not progressed much in the last 42 years. That is not to say that artificial intelligence in general has not progressed; it has. However, no one has built an interactive program that mimics humans with much more efficacy than ELIZA does. There have been some recent attempts, and perhaps this is now becoming vogue once again.

Instead of worrying about people who can’t distinguish a computer program from a real person, think about how much these people can be helped. Look how many of the videos on YouTube are made by people who cannot see the consequences of their actions. A quick check with one’s personal digital mentor might prevent the disastrous outcome of the typical “hey y’all, watch this” YouTube adventure. Or Weizenbaum, who was clearly smarter than I am, could be right, and such AI development could lead to a Terminator-style apocalypse. I suspect, however, that no matter the outcome, such AI is inevitable in time.

FreeNAS versus the NetApp Storevault S300 and the Iomega Storcenter Pro-Nas 150d.

February 23, 2008

In the latest issue of PC Magazine (the link is to a past review on the site), they reviewed NAS storage options.  I have to say that I am surprised at the cost of these boxes.  The NetApp is $2500 and the Iomega is $1700.  Both have a storage limit of 3 TB.  The NetApp gets transfer rates of around 64 Mbps at best.  The Iomega is slower.

The FreeNAS box I built with 4 TB of storage cost less than both boxes.  The upper limit on storage is significantly higher.  Transfer rates can be over 100 Mbps.  In short, a FreeNAS box with old hardware is far superior to these devices and costs less.

They also reviewed a Windows Home Server appliance, but that’s not even worth mentioning.  I am surprised that dedicated NAS devices aren’t all benchmarked against a FreeNAS server.  It is clearly the home and small business standard.

Building a FreeNAS file server with four 1 TB drives.

January 21, 2008

I ordered parts for a new server today.  With my success in building a FreeNAS based box, I’m going to try to build another one using four 1 TB drives in RAID 5.  If it works, I’ll finally be able to put all my data in one place.  Oh, and I also ordered a metric butt-ton of Cat 6 cable to try to help increase the speed of my supposedly gigabit network.  Good times are ahead!  I decided to wait on building a new quad-core flagship machine, perhaps until as late as May.

I’m really scavenging my KVM lately.  One of the four boxes went to build my first server, and now my Ubuntu machine is being sacrificed to build the second server.  Yet another box is going to be moved behind the TV in the basement to act as a video server.  I’ll only have one box left!

Maybe in May, instead of just building one flagship machine, I’ll build 4 and fill up my KVM with bad-ass processing power!  Muhahahaha!  Skynet will be born in my basement…

Best Practices for iTunes?

July 13, 2007

I can’t tell you how many times I’ve lost my playlists.  In the year and a half or so that I’ve had my iPod, I’ve always been moving iTunes from one machine to another, usually because the machine breaks.  For the stuff that I’ve bought, I have maxed out the authorized computers.

A month or so ago, I had the opportunity to buy a little laptop that I can pretty much carry everywhere.  It has now become my iPod server.  All music I acquire goes on this machine.  The iPod charges and syncs with this laptop only.  I can use automatic music management and I am happy.

When I think about all the music that’s been lost over the years, it’s frustrating.  I don’t think digital music is a better way to keep it than CDs.  Of course, I lose CDs too, or they get scratched or whatever.  I’m hoping that now, with one solid repository of music, it will be better.  Don’t get me wrong, I’ve done backups over the years; that doesn’t stop stuff from getting lost.
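One cheap safeguard is to keep a record of the playlist names outside of iTunes, so there is at least something to rebuild from.  This sketch assumes the XML library file (“iTunes Music Library.xml”) that iTunes keeps alongside its database, which is an Apple plist containing a “Playlists” array:

```python
import plistlib

def playlist_names(library_xml_path):
    """List playlist names from an iTunes 'iTunes Music Library.xml' file."""
    with open(library_xml_path, "rb") as f:
        library = plistlib.load(f)  # the library file is a standard plist
    return [p["Name"] for p in library.get("Playlists", [])]
```

Dumping this list to a text file after each sync wouldn’t save the music itself, but it would make lost playlists much easier to reconstruct.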

How do other people do it?  How do you manage your iPod?  How do you keep from losing playlists?  How do you keep from losing songs over the years?

RAID 0 and Repairing Windows XP.

May 29, 2007

I can be impatient with computers.  When they don’t shut down quickly enough, I’ve been known to hold down the on/off switch, causing them to shut down prematurely.  Or I just unplug them.  What the heck, right?  If it causes a .dll problem or some other Windows problem, just copy the file to the hard drive and you’re back in business, right?

Not if you have your boot drive set up in RAID 0.

I get impatient, right?  I like a machine that boots fast because I don’t like to leave my computers on when I’m not around.  People like to try to hack my boxes, or put spyware on them or whatever.  So I turn my computers off when I’m not using them.

Well, if you get two Raptor hard drives and you set them up in RAID 0, your machine will boot really fast.  Great for me, since I’m, well, you know, impatient.

Unfortunately, if you corrupt these drives like I did last week by forcing an early shutdown, you’re screwed.  I couldn’t get XP to recognize the partition because it was corrupted.  I lost everything.  Which is to say that I didn’t lose much.  I keep most of my important stuff on a non-Raptor drive.  I just use the Raptors for XP.
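The reason a corrupted RAID 0 set is so unforgiving is that striping interleaves every file across both drives, so damage to either one touches essentially everything.  A toy illustration in Python (my own sketch of the idea, not real controller behavior):

```python
# Toy model of RAID 0 striping: data is written in alternating fixed-size
# chunks across the disks, so losing either disk damages every large file.

STRIPE = 4  # bytes per stripe chunk (real controllers use 64 KB or more)

def stripe_write(data, n_disks=2):
    disks = [bytearray() for _ in range(n_disks)]
    for i in range(0, len(data), STRIPE):
        disks[(i // STRIPE) % n_disks].extend(data[i:i + STRIPE])
    return disks

def stripe_read(disks):
    chunks = []
    offsets = [0] * len(disks)
    i = 0
    # Re-interleave chunks from the disks in round-robin order.
    while any(off < len(d) for off, d in zip(offsets, disks)):
        disk = i % len(disks)
        chunks.append(bytes(disks[disk][offsets[disk]:offsets[disk] + STRIPE]))
        offsets[disk] += STRIPE
        i += 1
    return b"".join(chunks)

data = b"NTFS boot sector and registry hives!"
disks = stripe_write(data)
assert stripe_read(disks) == data    # both disks healthy: reads fine
disks[1] = bytearray(len(disks[1]))  # corrupt one disk (zero it out)
assert stripe_read(disks) != data    # every other chunk of the data is gone
```

With a mirror, losing one disk leaves a complete copy; with a stripe set, it leaves half of every file, which is why my corrupted Raptors took the whole XP install with them.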

But it was still a giant pain.