Tuesday, June 16, 2009

Preventing File Sharing for Fear of Illegality is Stupid

I came across this news article. I will start with a disclaimer that the article does not explicitly state any of the things I am about to rant about. With that out of the way, I have a serious problem with networks and internet service providers (ISPs) killing all usability of torrents. They can do this in a number of ways, but I am not going to get into the technical details here. My problem is that they cannot distinguish between a legal file download and an illegal one. Someone could very easily use a torrent to download the latest version of Ubuntu, which is quite legal. They might also be downloading the latest movie at the same time, which is very illegal under most circumstances. However, it is not illegal under all circumstances. Thus, most files cannot even be filtered simply based on what they are, because a given file download can be legal for one person and illegal for another. The result is that if you are going to seriously hinder file sharing to prevent software piracy, you are also preventing legal uses of it.

Most universities use bandwidth-shaping tools to limit how fast a file can be downloaded via P2P clients. I will not fight too much against the argument that a school wants to keep bandwidth available for web browsing, which tends to be more conducive to school-related activities than file sharing. Though I will say a computer science major can learn as much by downloading the latest Ubuntu .iso as a sociology major can learn on Facebook. Regardless, when a university prevents file sharing to check its illegal uses, since apparently college students are the number one software pirates in the world, things become stupid. I will admit most policies are put in place because some government agency is breathing down the university's neck, but stand up for yourselves already!

Why do I have a problem with eliminating P2P file sharing on a network? Let me ask you this: why do we not prevent FTP on those same networks? It can be used for illegal file sharing. It tends not to be used that way because it is less efficient, but it has the same illegal potential. However, let me point out in another way just how absurd this whole idea is. If Microsoft slanders Apple in a commercial on the ESPN channel on your local cable, who is responsible for breaking the law? Is it Microsoft for slandering Apple? Is it ESPN for allowing such an advertisement to be aired? Is it the cable company for allowing ESPN to air such a commercial? I am guessing most of you think it is Microsoft. However, preventing P2P file sharing because someone might attempt to acquire a file illegally is the equivalent of putting ESPN at fault. All P2P file sharing does is present the content. It is not illegal until someone uses it in an illegal way. Thus, any network or ISP preventing file sharing is a coward afraid to stand up to the lobbying RIAA mongrels.

Why do I call the RIAA mongrels? They started by attacking the poor, defenseless college students sharing files. They tried to make it the responsibility of universities to give up their students who were sharing various files the RIAA was monitoring. Some universities fell for this gag, and some students were wrongly punished because of it. Then some people started to stand up to the RIAA, and we all laughed because it finally got what was coming to it. It turns out college students are not always as defenseless as people like to think they are. Countersuits ensued. I will not go into the full history because I am not as well-versed in it as I would need to be to do it justice. So now the RIAA is trying to shut down file sharing. Good luck with that one. There will always be something along the lines of FTP at the very least, due to the need to push files to servers. The internet would die without such a technology. The RIAA just happens to be stupid enough to believe it can somehow win this fight. In the words of Agent Smith, "you are only delaying the inevitable."

My biggest pet peeve about attacking people, ranging from random people on the street to The Pirate Bay, is the entire concept. Let us say I leave my school bag and a CD unattended on a table in the union at my college because I want to go get food and I have a friend coming by any minute to borrow the CD. Some random person walks by and steals the CD. You might argue I am stupid for leaving it there, but I do not think you would say I am at fault under the law. Am I an accomplice in the stealing of my own CD? It seems a bit ridiculous, does it not? I am allowed to loan my physical CD to a friend under the law. The only thing I have done is leave the CD in a place where it can be easily taken by someone who does not have the legal right to possess it. This is the same situation as file sharing. I am not asking random people on the internet if they want a particular file. I am merely leaving the file in a public place for the appropriate people to pick it up.

If this is illegal, I argue any computer possessing copyrighted information or confidential data and connected to the internet is illegal. I will paint another picture for you. Pretend I have a server at home with various music files on it that I purchased through iTunes. This is obviously only hypothetical since I would never use iTunes. I decide to open SSH to the world so I can work on some things on the computer while I am at my grandma's house for the holidays. Have I broken a copyright law? Someone could illegally acquire the music by hacking my SSH login. The only difference here is that I put a security barrier between the file and unauthorized users. However, if it is illegal to share files because people might download them illegally, then is it not also illegal to set up SSH? I am providing an opportunity for a hacker to illegally access my system. I should be ashamed of myself! The only difference is the amount of security I have set up in an effort to prevent unauthorized access, just as I had no security to prevent my CD from being stolen off the union table. In case you are worrying, no, I have never had a CD stolen from me at college.

Just to drive the point home: if you want to be stupid, try to prevent file sharing because someone might use it illegally. Ooo! Maybe we should shut down the internet because somebody might use it illegally!


Thursday, June 4, 2009

FOSS

I was talking to my brother-in-law, Jamie, the other day about Free and Open Source Software (FOSS). It got me to thinking about how much FOSS I use on a regular basis. Here is a list of all the software I have come into contact with recently. If you realize I am forgetting something, please leave a comment.

Software Used on a Daily Basis:
Software Being Investigated:
  • Trac (Project Management Tool)
  • Zabbix (Network Monitoring Tool)
  • Wordpress (Content Management System)
  • SVN (Version Control System)
  • Elgg (Social Networking Software)
Software Used in the Past:


Wednesday, June 3, 2009

Trac Review

I spent some time at the end of April playing around a little bit with a project management tool known as Trac. Then a friend asked me my thoughts on it, so I told him I would write a blog about it. A month later, I am finally getting it done! Let me first qualify that I was looking at the tool for managing software development projects.

The Cool Things about Trac:
  • Lets you create milestones in the development process. It seems to be common to use these to represent version releases. My initial thought was to use them for various key components of the software.
  • Has a great ticket system for reporting bugs found in the software. These can be set up to allow your users to fill out tickets when they discover a bug. The tickets can also be attached to milestones, allowing you to manage fixing them as well.
  • Integrates a wiki as well to allow you to document the software and its development process.
The Disappointing Things about Trac:
  • Installation is a series of commands that call Python scripts, if I am not mistaken. This made me sad. I wanted a pretty web interface to ask me the questions, because it would make installing on my web host a lot easier. I can survive, but I think it would be a big step in the right direction.
  • Along the same lines as the missing installation web GUI, administration tasks need to be pushed completely into a web interface. Some things can be done in the web GUI, but not enough. I still had to run commands from a terminal for very basic setup and user administration needs.
  • No concept of sub-milestones. I see milestones as tasks that need doing and tickets as bugs. From all the reading I did of the Trac community and developers, this does not appear to be the future direction of Trac. Right now, people create milestones and attach tickets to them, so tickets end up being used to represent tasks. Rather than implementing sub-milestones, the direction is to implement sub-tickets. Personally, I just don't like this system, but I can roll with it if needed.
  • No forum integration, as I recall, but I could very easily be wrong on this one. A forum would allow your user community to talk with and help each other. It would be useful for all those cases that do not make it to the wiki, or when a user simply does not use the wiki or cannot find what they need there. It could also be used for communication between developers.
After all of this, I will likely still use Trac. I spent some time looking at other project management software, and this was the closest one to really handling what I needed. I did have the requirement that the software be free and preferably open source. I do not intend for this to be a large implementation, so the unpolished aspects are not as big of a problem. Things were still easy to figure out, so I did not have to spend much time troubleshooting or pushing through a learning curve. Ultimately, if you are looking for something like this, feel free to give it a solid look and see if it will work for you.


Perl => Micro$haft Rant

The other day I tweeted in a mad fury with very little details as to the source of my frustrations. To enlighten my Twitter followers, here is the email which served as the source of such anger:
It turns out that when you pass an array as an argument in Perl, it flattens the array, thus giving you a lot of arguments.

Example:
@array = (1, 2, 3);
Print_array( @array );

Sub print_array {
My ($first, $second, $third) = @_;
Print "$first\n";
Print "$second\n";
Print "$third\n";
}


Rather Than:
@array = (1, 2, 3);
Print_array( @array );

Sub print_array {
My @nums = shift;
Foreach my $num (@nums) {
Print "$num\n";
}
}


Unfortunately this really screws things up when you want to pass an array along with some scalars.

Such As:
@array = (1, 2, 3);
$sum = 6;
Print_math( @array, $sum );

Sub print_math {
My (@nums, $math) = @_;
Print $nums[0].'+'.$nums[1].'+'.$nums[2].'='.$math;
}


Working Solution:
@array = (1, 2, 3);
$sum = 6;
Print_math( \@array, $sum );

Sub print_math {
My ($num_array, $math) = @_;
My @nums = @$num_array;
Print $nums[0].'+'.$nums[1].'+'.$nums[2].'='.$math;
}


Ignore the auto-caps. Have I ever mentioned I hate M$ products?

Microsoft: "We're so smart we know how to capitalize sentences. You must be too stupid to know how to do this like we do. We will help you capitalize your sentences."

Me: "But I'm not typing sentences..."

Microsoft: "What do you mean? People only type sentences in their emails. No one except us understands code or other non-sentence based writing."

Me: "Go #@$* yourself Micro$haft! You're #@$*ing retarded! Your 'increase indent' button claims it can't work with plain text emails even though I clearly have tabs in this plain text email because I had to type every @*%&@() one of them myself!"
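For contrast, here is a rough sketch of how a language without argument flattening behaves. I am using Python purely for illustration (it is my addition, not part of the original email): a Python list is passed as a single reference, which is exactly the behavior the email was expecting from Perl.

```python
# A list in Python arrives in the function as one argument;
# it is never flattened into separate scalar arguments.
def format_math(nums, total):
    # nums is still a single list here; total is a separate scalar
    return '+'.join(str(n) for n in nums) + '=' + str(total)

array = [1, 2, 3]
print(format_math(array, 6))  # prints "1+2+3=6"
```

In Perl, the reference trick shown in the working solution above accomplishes the same thing.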


Sunday, May 24, 2009

My Network

I spent most of my free time in the last week setting up computers the way I want them with some new technology. I had been eyeing the ideas for a while, but school had kept me too busy. I decided to write up my current setup, and then I will elaborate on each individual system in due time. Anyways, here is the general rundown on my network.

Trogdor (Ubuntu 9.04 Desktop Edition)
The computer I am on the most is my laptop. I use it as my workstation. All my regular applications run on it. These include Pidgin (instant messenger), Thunderbird (email), Firefox (web browser), Eclipse (IDE), Spaz (Twitter client), and OpenOffice (documents). Eclipse handles all my software development, from C++ to PHP to database design. I also run a local LAMP (Linux, Apache, MySQL, and PHP) server on Trogdor for easy testing of any web development.

Ellinore (Ubuntu 9.04 Desktop Edition 64-Bit)
Ellinore is my file server for the most part, but she is gaining an emphasis as a media server. She is hooked up to my TV, allowing for easy playing of videos. She is also running MediaTomb, a streaming media server, which lets me play music and videos on my Playstation 3.

The Beast (Ubuntu 8.04 Server Minimal Install)
This is a very old computer I managed to pick up from my parents for free. It essentially amounts to my sandbox. I configure various services on it to get a feel for them. I will also see what it takes to hack them just to get an idea for how secure a configuration really is.

Playstation 3 (N/A)
My Playstation 3 has no other operating system on it. I use it simply for playing videogames as well as watching videos or playing music. Some day I may put another OS on it, but for the time being, it is just there for pure enjoyment.

Linksys WRT54GL Router (Tomato firmware)
My router is a nice Linksys router with B/G wireless coverage and four 100 Mbps ports. I have the Beast, Ellinore, and the PS3 wired to it, and Trogdor connects over wireless. It is currently running Tomato rather than the default Linksys firmware.


Thursday, April 23, 2009

Compiler Project Review

I am currently in a Translation of Programming Languages class. The primary focus of the class is on building a compiler. My class, all six of us, decided to be one single team. We then decided to create our own language which we would compile to Java bytecode. We have been severely behind schedule the entire semester. Recently, I have been thinking about some of the reasons why.

I believe the largest problem was our use of the waterfall lifecycle model. It was not our intention, but we fell into it nonetheless. This caused us to attempt to develop components from start to finish without error on the first try. Not a good idea. The other problem is that we have six people on the team, and there are not enough different pieces in each component for all six people to work on one component at once. An agile approach would have suited us far better. Toward the end of the semester, as we have started to make progress, I have seen us shift a little more toward such tendencies. The nice part is that it gives us more pieces to work on. This allows us to divide the team up rather than clustering on one component and getting in each other's way.

The second largest problem we are having is a lack of documented project management. Everyone is kind of off working on their own random thing, and we randomly bring it together in the SVN repository. However, there is no rhyme or reason to what different people are working on. We have a wiki for documenting decisions, plans, et cetera, but we have not made proper use of it. It probably would have been a better decision to use some sort of project management software to accompany the wiki; one example of such software is Trac. This would have let us keep better track of our different requirements and bugs. From there, we could have had a better idea of who would be working on what.

We have had a few other problems along the way, such as our use of Subversion for version control. That being said, these other problems have been minor in comparison to the two problems I discussed above, or they have been a consequence of those problems. Moral of the story: software engineering and project management are important!


Tuesday, February 17, 2009

Programmer Jokes

I was recently emailed this site of programmer jokes. I had to pass it along because some of them are hilarious. Here is one of my personal favorites:

Jesus and Satan have an argument as to who is the better programmer. This goes on for a few hours until they come to an agreement to hold a contest with God as the judge. They set themselves before their computers and begin. They type furiously, lines of code streaming up the screen, for several hours straight.

Seconds before the end of the competition, a bolt of lightning strikes, taking out the electricity. Moments later, the power is restored, and God announces that the contest is over. He asks Satan to show his work. Visibly upset, Satan cries and says, “I have nothing. I lost it all when the power went out.”

“Very well,” says God, “let us see if Jesus has fared any better.”

Jesus presses a key, and the screen comes to life in vivid display, the voices of an angelic choir pour forth from the speakers.

Satan is astonished. He stutters, “B-b-but how?! I lost everything, yet Jesus’ program is intact! How did he do it?”

God chuckles, “Everybody knows… Jesus saves.”


Friday, February 13, 2009

C/C++ in Eclipse on Windows

I recently helped a friend set up Eclipse on Windows Vista to develop C/C++ code. I remembered how painful it was for me the first time, so I thought I would write a blog detailing the process to make life easier on others. Thankfully, the process has been greatly simplified since I first tried it a few years ago.

I. Download Eclipse

I am going to assume you do not already have Eclipse on your system. If you do, simply download the CDT package as an add-on to accomplish this step. For those who do not already have Eclipse, you can get it by going to the Eclipse website at http://www.eclipse.org. Then click on the Downloads link. Here you will likely see a listing of several different packages. You will want the "Eclipse IDE for C/C++ Developers." All that has to be done then is to extract it to wherever it is going to be kept. I tend to put it in C:\Program Files\eclipse\, but that is just personal preference. To run Eclipse, simply execute eclipse.exe.

II. Install MinGW

I am going to provide a link to a file, but I highly recommend grabbing the file from their website in case anything new has been released. The reason I am providing a link at all is because they do not currently have an easily distinguishable link on their own website. Anyways, their website is currently located at http://www.mingw.org. From there, scroll down until you see "HOWTO Install the MinGW (GCC) Compiler Suite" on the left under "Popular Content." Then there is a link under the "Using the Installer" category to the installer's download page. This installer saves a lot of hassle. Download the .exe and run it. I recommend the default location, which is C:\MinGW\ last I checked. Make sure to install the GCC core components, C++, and MinGW make.

Right now, you should be able to run the command C:\MinGW\bin\gcc and get a polite error message about no input files. This means MinGW is working! However, to make it easier to use, you will want to add the MinGW bin directory to your system PATH variable. How do you do this? That is exactly what I am writing this for. Open a command prompt by pressing <Windows key> + R and entering "cmd", going to "Run..." in the start menu and entering "cmd", or typing "cmd" into the search field of the start menu on Vista. Once you have a command prompt open, type
PATH=%PATH%;C:\MinGW\bin

and hit enter. Note there must be no spaces around the equals sign, and the change only lasts for the current command prompt session; to make it permanent, add the directory to PATH through the Environment Variables dialog in System Properties. Now you should be able to simply type "gcc" and get the nice error message. Voila! MinGW is completely installed!

III. Configuring Eclipse

There is one final step: configuring Eclipse to use MinGW. Start up Eclipse and go to Window > Preferences at the top. Drill down through C/C++ > New CDT Project Wizard > Makefile Project. On the right, click on the "Builder Settings" tab. Uncheck "Use default build command". This will activate a text field just below it labeled "Build command:" with the default value of "make". Replace that with "mingw32-make". Click "OK" at the bottom to save your settings.

Voila! You have just set up Eclipse to compile C/C++ code on Windows!


Monday, February 9, 2009

Pushing Through

Eugene Wallingford just recently wrote this blog about embracing failure. The topic is one I often think about because it is an area I feel I tend to excel in above others. Thus, I thought I would provide my own opinion on the matter.

First off, when Wallingford says, "what other people call failure is learning to fly," he is wrong. According to the analogy he is running with, failure is hitting the ground after jumping off a cliff. Taking the chance of hitting the ground in an effort to learn to fly is the same as risking failure in an effort to succeed. Just thought I'd throw that out there at the start.

My personal thoughts are that it is not about embracing failure. Instead, it is about pushing through the failure until I finally reach success. That is why I did not title this "embracing failure" as Wallingford did. When I am struggling with a programming problem, I always describe it to my friends as "beating my head against the wall." Please note that only occasionally am I actually beating my head against the wall. It is a fight. When I come up with a solution that does not work, it frustrates me further. If I never figured out a working solution, I would be left in a state of extreme frustration. This is one reason I rarely leave a programming problem unfinished for another day. It will irritate me and distract me from everything I do until I finish it. There is no embracing the problem. There is only beating it into submission.

What drives me to continue enduring such aggravations? In some respects, it is about the struggle itself, but I intend to devote a full post to that idea someday, so I will skip it for now. Ultimately, it is the euphoria after achieving success that keeps me going. The most frustrating problems are the most rewarding to conquer. I have been known to take off running down the hallway in my dorm yelling in excitement after discovering a working solution. It is because of this that I claim it is not about embracing the failure, but rather pushing through it.

My determination to excel is not only evident in programming. In high school, it was not uncommon for me to spend a couple hours out in 100-degree heat hitting a shopping cart full of tennis balls by myself just to work on a few shots. When I was in late elementary and early middle school, I would be out in my driveway shooting hoops quite literally until it was so dark I could not see the basket anymore. When I run or bike, particularly long distances, I always end up going faster as I progress. It is not that I start off too slow, but rather that I become charged with the idea of reaching the finish line. In my Personal Wellness class just last week, we had to jog a half-mile. I found myself almost sprinting by the last 200m. While playing videogames, I will repeat a particular task over and over again until I can finally beat it. Last year, it took me ten hours to beat the final bonus mission on Call of Duty 4: Modern Warfare on the hardest difficulty. The mission was a maximum of 60 seconds long. You do the math.

Perhaps the best way to summarize my thoughts on the matter is with this well-known phrase:
"When the going gets tough, the tough get going."


Wednesday, December 10, 2008

Stored Procedures vs. Prepared Statements

At work the last few weeks, I have been doing a lot with SQL in a web application. I will not say I am an expert in the matter, but I do feel I have a pretty good understanding of the two options. Ultimately, a stored procedure seems to be more annoying than useful. The problem I have is that the languages RDBMSs provide for stored procedures tend to be pretty bad as far as programming languages go. There is a reason you have chosen to write your application in a particular language: chances are it is the best fit for the situation and provides a lot more power than the meager implementation offered by the RDBMS. That means you need to get some real advantage out of using stored procedures to make them worth your while.

So what advantages can stored procedures offer? Protection from SQL injection is thrown out because you get the same benefit from prepared statements. I have heard some arguments about performance gains, the two primary factors being caching of the SQL statements and the round-trip time to the database server. From what I have heard, most RDBMSs these days cache prepared statements just as well as stored procedures. Besides, on today's machines, you would have to write the world's most complex SQL statement before you would notice the hit on the server. Meanwhile, the round-trip time (RTT) can be a valid argument. If you have a lengthy series of statements with some simple logic controlling them, there is some potential benefit to using stored procedures. This depends on how strict your timing requirements are, how many calls of this nature you make, and the length of an RTT to the database server. In most environments, the time from the application server to the database server is only a few milliseconds.

Ultimately, your average web application does not have strict timing requirements. It tends to be more dependent on the speed of development and the ease of maintenance, and the difference is never large enough to worry about. When I am developing applications, I intend to lean toward prepared statements and will only shift to stored procedures when I absolutely need to.
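To make the prepared-statement side of the comparison concrete, here is a minimal sketch using Python's built-in sqlite3 module. The language and database are my choices purely for illustration; the original discussion does not assume any particular RDBMS.

```python
import sqlite3

# In-memory database just for demonstration purposes.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT)")

# The ? placeholders keep user input out of the SQL text entirely,
# which is where the protection from SQL injection comes from.
name = "alice'; DROP TABLE users; --"
conn.execute("INSERT INTO users (name, email) VALUES (?, ?)",
             (name, "alice@example.com"))

row = conn.execute("SELECT email FROM users WHERE name = ?", (name,)).fetchone()
print(row[0])  # the malicious-looking name was stored as plain data, not executed
```

The same placeholder idea carries over to most database drivers, and a driver can prepare the statement once and execute it many times, which is where the statement-caching benefit comes from.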


Tuesday, December 2, 2008

A Linux Success Story

So the other day, I received a phone call from my mother. My brother-in-law, Jamie, and I had just recently given her a "new" computer (a used one from my grandma) with Xubuntu installed on it. Then Thanksgiving break ended. So here's a paraphrased version of the phone call:

Me: Hello
Mom: Hello, is this a good time to talk?
Me: Um...(pause as I try not to die in Resistance 2 on the Playstation 3)
Mom: Well, I'll just make it quick then. Am I supposed to get something more than a black screen when I turn on the computer?
Me: Yeah...Do you see a green light?
Mom: Yes.
Me: Is it on the monitor?
Mom: Yes.
Me: Do you see a green light on the tower?
Mom: Let me look. No, I don't think so.
Me: Well you'll need to push the big button in the middle with the power symbol on it.
Mom: Give me a moment to find it. ... ... I'm not seeing it.
(Short period of time as I try to describe the big circle button smack dab in the middle of the front of the tower with the power symbol on it. I really don't recall how I did this as I was once again trying not to die while playing my game. Eventually, though, my mom found the button.)
Mom: Oh, I see something now!
Me: (heavy sigh because I finally died)
Mom: Well, it sounds like things are going so well. I should probably let you go.
(End of conversation)

The next day she called me about some random thing and mentioned how much she liked the new computer. This is the same lady who has a notecard with notes on how to turn on every computer she has ever owned. She implicitly said she liked a computer running Linux. Now if that's not a testament to how easy Linux is becoming to use, I don't know what is.

P.S. I did have to teach her how to shut it down since the menu is in the upper right rather than the lower left...


Monday, November 10, 2008

COG Developers

As some of you may remember, I wrote a blog a few weeks back about starting to work on a new website. Well, Champions of Gaming (COG) is officially under development. I am working on it with Mark Fedor (papasmurf786) and Chris Tuvera (eviltim11). They are both people I play videogames with on a fairly regular basis. Mark I have only known for the last two and a half years, as I recall, while I have known Chris since first grade. Mark has some pretty impressive graphics skills and is starting to learn some basic programming concepts. Chris is constantly getting better and better with his graphics skills. He also has some programming skills, though he has yet to delve much into web development. For the moment, I am doing most of the backend development while working with Chris and Mark on the frontend.

As for how the development is going, we have officially bought the domain now. We are hoping to pick up a few similar domain names, but that is still up in the air at the moment. The look-and-feel is pretty much set, and I must say it looks pretty sexy. We have mapped out a few placeholders for some features we think will be cool. No real functionality has been put in place yet, but a lot of ideas are being thrown around and decided on. I do not know how much will get done before Thanksgiving, as all three of us are in college at the moment. However, hopefully, we will make some significant progress over winter break.
