Wednesday, January 9, 2013

Why Should Anyone Use Linux

Quoting myself in a comment I wrote on this Google+ post.

To me, a computer is a computer. But there are "political" advantages to using Linux. When you use any software you are investing time in learning the skills to use that software, and perhaps more importantly, you are trusting your data with the people who made the software.

If you agree with that, then ask yourself, which group of people would you trust more?

  1. a company that is under contract to keep your data safe only until you switch to a competitor, stop paying them, or the contract expires, and that restricts you by contract to using the software only for explicitly allowed purposes, or
  2. a community of people who all work together to make the software work, where all the people have in common the need for the software to work reliably because they all invest their time and trust into it, and everyone benefits from the software being used by more people because there is strength in numbers, so they encourage you to use it freely and in any way you would like.

If you need a computer only to do a job, and once the job is done, you can throw the computer away along with all of the data and just move on to the next job, then (1) is your best choice. But if your computer is more of a home to you than just a tool, and your data is more like your artwork, as opposed to just data, then (2) is your best choice.

Linux is choice (2).


I would like to add here that I am also a stubborn believer in free markets, and since I so thoroughly disapprove of companies like Microsoft and Apple, I like to think that if enough people have this "we'll never shop there again" attitude, we might yet influence these companies to become better, even though I am perfectly aware of how naive this is. Still, for the sake of justice, I refuse to give even one cent to either of those corporations.

I've tried numerous times to explain the advantages of Linux to not-so-technically-inclined people, but no one really gets it. I'm a professional, I know which tools are the best, and I know Linux is the best tool. But I can never really explain why it would be better if not just experts but everyone used it.

But after trying for so long, you get better at explaining through practice. This comment I wrote on Google+ somehow strikes me as the best explanation I have yet written, and I was disappointed to see it drowned out by so many other comments so quickly, so I posted it here. I hope more people see it.

Wednesday, December 26, 2012

How to mount partitions in an ".img" file.

Files that end with ".img" are byte-by-byte copies of the information on a hard disk, or "images" of the disk. These files are a bit different from ".iso" files in that they are images of filesystems intended for fast random access, especially hard disk drives and flash drives. They almost certainly contain an fdisk (MBR) partition table. ISO files, on the other hand, are an image of a CD, DVD, or Blu-ray disc, and so they have a rather different internal data structure.

Images can be created with the Unix/Linux command "dd" to do a quick and dirty backup of an entire system, including the partition table and boot sector. You can copy these images directly onto hard disks to avoid having to go through the whole process of installing an operating system on every disk.
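
For example, a minimal sketch (the device names "/dev/sdb" and "/dev/sdc" are placeholders; double-check them before running anything, since "dd" will happily overwrite the wrong disk):

 % sudo dd if=/dev/sdb of=backup.img bs=4M    # copy the whole disk into an image file
 % sudo dd if=backup.img of=/dev/sdc bs=4M    # write the image onto another disk
 % sync                                       # flush pending writes before unplugging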

Embedded systems like Raspberry Pi, Pandaboard, and Beagleboard often build an operating system image and distribute it as an ".img" file. Intel's now defunct Mobile Linux "Moblin" distributed their operating system in this way. At one point, so did Ubuntu, for their "Netbook Remix."

How can you look at the contents of these image files? Is there a way to "plug-in" these image files and see the files inside, as if you were plugging a USB-stick into your computer? With Linux, of course there is!

As root, using "mount"

Find where the mount point begins by looking at the partition table. You need the offset in bytes. The "parted" command "unit B" sets units to bytes when displaying partition tables.

 % printf 'unit B\nprint\nquit\n' | parted raspberry-pi.img
 WARNING: You are not superuser.  Watch out for permissions.
 GNU Parted 2.3
 Using /home/ramin/boot-disks/raspberry-pi.img
 Welcome to GNU Parted! Type 'help' to view a list of commands.
 (parted) unit B
 (parted) print
 Model:  (file)
 Disk /home/ramin/boot-disks/raspberry-pi.img: 1939865600B
 Sector size (logical/physical): 512B/512B
 Partition Table: msdos

 Number  Start        End           Size          Type     File system  Flags
  1      4194304B     62914559B     58720256B     primary  fat16        lba
  2      62914560B    1939865599B   1876951040B   primary  ext4

 (parted) quit

You can mount a partition in a file using the loopback option, giving the offset of the partition.

 % mkdir ./part1 ./part2
 % sudo mount -o loop,offset=4194304 -t msdos ./raspberry-pi.img ./part1
 % sudo mount -o loop,offset=62914560 -t ext4 ./raspberry-pi.img ./part2

The "-t ext4" parameter is optional for "ext" filesystems.

The advantage of using "mount" is that you can read and write files in the filesystem: you can change anything, and the changes stick. The disadvantage is that you need to be the superuser, but this isn't an issue on most people's personal computers.
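
When you are done, unmount the partitions as usual and the loop devices are released:

 % sudo umount ./part1 ./part2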

As a normal user, using "grub-mount"

Grub has utilities which can make use of the "fuse" library, and the loopback device. First, you need to find out how Grub refers to the partitions in the image (you must use full paths here):

 % grub-fstest "$PWD/raspberry-pi.img" ls
 (loop0) (loop0,msdos2) (loop0,msdos1) (host) 

The "grub-fstest" has a "ls" command which prints each item it finds in the device hierarchy. The "host" device is your host operating system's filesystem, ignore this. Since you specified a regular file as the device to test, the device hierarchy is "loop*", in this case loop0 because 0 is currently the first available loopback device on my system. Grub found two partitions in the file, which are now listed as "loop,msdos1" and "loop,msdos2". These will be the values used for "grub-mount".

 % mkdir part1; mkdir part2
 % grub-mount -r loop0,msdos1 "$PWD/raspberry-pi.img" "$PWD/part1"
 % grub-mount -r loop0,msdos2 "$PWD/raspberry-pi.img" "$PWD/part2"

Now these partitions are mounted as read-only filesystems through the loopback device.

The disadvantage of using "grub-mount" is that you need the "fuse" library installed on your system, and that you have read-only access: you cannot modify the files or directories. The advantage is that you don't need superuser privileges to do it.
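
Because these are FUSE mounts, release them with "fusermount" rather than "umount":

 % fusermount -u ./part1
 % fusermount -u ./part2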

Monday, December 10, 2012

Computer Hacking Scenes from the Movie "Skyfall"

After watching the James Bond movie "Skyfall," one scene stood out, where the character Q is "hacking" the villain's computer and analyzing a program whose code has been obfuscated. He seems to have analyzed the call graph of the program, and has visualized the graph in 3D. He mentions that every time he tries to "access" the code, it changes shape.

Once I got over the preposterous misuse of terminology like "security through obscurity" and "code obfuscation" and "access," I calmed down and let my imagination run wild for a bit with the science fiction I had just witnessed.

First, it is possible that this "code obfuscation" was a graph-rewriting algorithm, hence the "Rubik's cube that fights back." But a skilled analyst would probably try to freeze an image of the code in memory after a single rewriting iteration, and use a graph-analysis technique that does not require executing the code, so as to prevent it from shifting its shape.

However, some graph analyses must, by necessity, partially execute the code. There are situations where a branch of execution is decided entirely by an arithmetic computation. You can narrow down the number of possible branches by analyzing the values that could be plugged into the computation, to determine reasonable bounds on the resultant values that choose the branch. But if you want to know exactly which branch the program takes, the only way is to actually execute the program up to that branch and then observe what it does next.

For example, suppose you have a large array containing a reference to every function in the program. As you trace the program's execution, you arrive at one function where the inputs are passed through a hash algorithm to produce a key value. This key is then used to select an element from the array, and the selected function is called. Your call graph now has a branch pointing to every function in the entire program, which is useless to someone trying to figure out the flow of execution.
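
Here is a toy bash sketch of that idea (my own illustration, nothing from the movie): every function funnels back through a single dispatcher, which hashes the accumulated state to pick the next function out of a table, so the static call graph of the dispatcher points at every entry.

 #!/bin/bash
 # Toy sketch of hash-driven dispatch: "dispatch" hashes the current state
 # string and uses the hash as an index into a table of functions, so any
 # static analysis sees dispatch branching to everything in the table.
 table=(step_a step_b step_c halt)
 step_a() { echo "step_a($1)"; dispatch "${1}a"; }
 step_b() { echo "step_b($1)"; dispatch "${1}b"; }
 step_c() { echo "step_c($1)"; dispatch "${1}c"; }
 halt()   { echo "halting with state: $1"; }
 dispatch() {
     # reduce the md5 of the current state to an index into the table
     local i=$(( 0x$(printf '%s' "$1" | md5sum | cut -c1-8) % ${#table[@]} ))
     "${table[$i]}" "$1"
 }
 dispatch "${1:-seed}"

Run it with different seed arguments and the chain of calls comes out completely different each time, bouncing around the table until the hash happens to land on "halt" -- exactly the bouncing-around behavior described next.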

Now, what if the entire call graph of the program is determined in this way? That is, every function call in the program is determined by hashing input values and selecting the next function to call from the array by the hashed value. Sure, if the hash is unpredictable, your program will just bounce around through every function in the array, not really doing any interesting work and probably getting stuck in an infinite loop eventually.

But what if your hash algorithm was specially designed in such a way that the function calls were not unpredictable? What if your hash algorithm was so perfect that the correct inputs would result in a call graph that actually performed a specific algorithm, performing useful work?

Well, if your "hash" is really just ordinary arithmetic on the input values to the function, that wouldn't be any different from any ordinary computer program. It would have to be more like a hash tree with loops: an initial value is hashed, the result is passed to the function and used to make a choice, the choice taken is then passed a hash of the previous hash value, and so on.

This is indeed far-fetched at best, and impossible at worst, but I am curious. I am reminded of Lie groups: what if a program could be expressed as a Lie algebra, and every possible call graph of the program were a local algebra of the global Lie group, for example? I don't even know if that made sense, but I am envisioning every possible initial value producing an entirely different call graph, where all these graphs are somehow related mathematically due to the nature of the hash algorithm.

Of course, why go to all this trouble? If the program doesn't have the initial hash values somewhere, it won't be able to execute itself. If you include the initial values, an analyst would eventually figure out exactly how to generate the call graph and your program wouldn't be so obscure anymore. Maybe you intend to install the program and initialize it with the correct hash values by hand when you intend to execute it. But then, why not just encrypt your program with an asymmetric key, and only decrypt it when you are ready to execute it?

Tuesday, October 23, 2012

A Seasoned Ubuntu User, Using Fedora for the First Time

I love Ubuntu, really, even now after my recent problems. And I can't wait to install Ubuntu 12.10 "Quantal Quetzal" on my personal netbook which I use every day. But I have recently switched my office computer to Fedora for a variety of reasons. TL;DR, I really like Fedora.

After performing a routine software update, I found GRUB booting my computer into a kernel panic. This, and a host of other recent glitches, prompted me to make the switch. With deadlines looming and only my personal netbook to get my work done, I decided that as long as I was going to be doing a fresh install, I might as well switch to something more stable.

Don't get me wrong, Ubuntu is very stable, but not compared to Red Hat. Red Hat's business depends on stability, for its use in enterprise servers at multi-billion dollar companies, unlike Ubuntu, which depends on attractive design and ease of use for individual consumers, regardless of how it works under the hood.

First of all, Fedora 17 has support for the latest stable 64-bit Linux Kernel, at this time they are at 3.6.3. Ubuntu was stuck at 3.2 for the past 6 months. Ubuntu 12.10 now uses Linux kernel version 3.5.3 -- still not at the latest stable release. Like I said, Ubuntu's goals do not so much concern what is under the hood as to how it looks on the surface.

The 64-bit kernel seems to run much faster on my Intel Core 2 computer. I don't have any hard data to back this claim; I simply noticed that cleaning and re-compiling my software projects takes noticeably less time now. But when I use a cross-compile GCC toolchain linked against the 32-bit glibc, it runs as slowly as it did under Ubuntu.

One thing I learned was that the 32-bit glibc is not installed by default under 64-bit Fedora, and if you don't have it, older software compiled for the 32-bit glibc will not work. This caused me a bit of grief when my cross-compile toolchain stopped working. But after a bit of Googling, I found the solution. It is really easy to install the 32-bit glibc, just type: yum install glibc.i686 and boom, it works.

I also learned that most software packages come in 32-bit and 64-bit versions: the 32-bit versions can be installed by typing yum install package-name.i686, whereas the 64-bit versions are installed by default, but you can install them explicitly by typing yum install package-name.x86_64. This is a big difference from Ubuntu, where I had only ever used 32-bit Linux and therefore this distinction never came up.

I also learned that you can install multiple versions of a package using yum by specifying the version number after the package name, but before the ".i686" or ".x86_64", so yum install package-name-1.2.3-r45 or yum install package-name-1.2.3-r45.i686. However, "yum erase" only works on the latest version. This was another thing I learned: you don't use yum for everything, sometimes you use "rpm". This is different from Ubuntu, where you use "apt-get" for pretty much everything, and hardly ever use "dpkg".

In Fedora, "rpm" tells you which versions of which packages are installed and lets you delete older versions. List the versions you have installed with the command: rpm -q package-name (it prints every installed version of that package). I kept having to look at the manual page for "rpm" because I kept typing rpm -d for "delete". NO, it is rpm -e for "erase". Also a bit confusing was the fact that you can install an ".rpm" file by typing the filename: rpm -i /path/to/package-file-1.2.3-r45.i686.rpm, but to erase the package you need to know the actual name of the package, not the ".rpm" file's filename. So if your "package-file.rpm" installs a package called "foo-bar-baz", you need to erase "foo-bar-baz" and not "package-file."
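
For example, the kernel package routinely has several versions installed side by side. A session might look like this (the version numbers here are made up for illustration):

 % rpm -q kernel
 kernel-3.6.2-4.fc17.x86_64
 kernel-3.6.3-1.fc17.x86_64
 % sudo rpm -e kernel-3.6.2-4.fc17.x86_64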

Now that I have gotten used to the differences between "yum" and "apt-get", and "rpm" and "dpkg", I am starting to really like Fedora's software repository, and Yum has been much more fun to use than "apt-get". With "apt-get" I get frustrated and often resort to using "aptitude's" nice menu-based interface. For some reason, I never feel the urge to use a menu-based alternative to Yum; I think the commands to search for packages and list installed packages are just easier to use. But it could just be that the novelty makes it seem more fun to me.

One frustrating bit with Fedora is that the public package repositories are much slower than Ubuntu's. At first, I thought I was having problems with the proxy because it was taking so long to download the updates list, but eventually the system was up to date. I guess that is an incentive to buy support from Red Hat. But it is frustrating. By the way, yum upgrade is equivalent to apt-get upgrade, so it is easy to remember.

Since I bought my netbook in early 2009, I have gone fully Ubuntu Linux, not using Windows or Mac OS for anything, not at home, not at the office. Since mid-2010, my work computer has run Ubuntu as well. But unfortunately, Ubuntu isn't really intended for software developers. They make subtle changes, like how software links to glibc. They also set their "/bin/sh" to point to the lightweight "dash" interpreter, which affects how the C "system()" function works. Dash is actually more POSIX-compliant than Bash, but most of the build scripts I use assume you are using Bash and all of its non-POSIX functionality. Using Dash can cause seemingly random problems to crop up.
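
For example, brace expansion is one Bash feature that Dash lacks, so a script (or a C program calling system()) that relies on it behaves differently depending on where "/bin/sh" points (output abbreviated):

 % ls -l /bin/sh
 lrwxrwxrwx 1 root root 4 ... /bin/sh -> dash
 % sh -c 'echo {1..5}'
 {1..5}
 % bash -c 'echo {1..5}'
 1 2 3 4 5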

These tiny changes can actually throw a wrench into the works if you compile large, complex source code trees that have been maintained over tens of years, all the while being written with the assumption that "/bin/sh" links to Bash, and that GNU "ld" defaults to the same linker options used to compile glibc. Ubuntu has since changed those default linker options, which changes how every other dynamic library is linked with "ld".

In all, I owe so much to Ubuntu for how it introduced me to the world of Linux. But now that I have graduated from "Linux Lieutenant" to "Linux Captain," Fedora might well be what I use from now on. I haven't switched away from Ubuntu entirely yet, though. I really like Unity. Although Gnome 3 is very nice, and I have had no problems using it, Unity is unquestionably better than Gnome 3. I especially like the Global Menu; I was a life-long Apple fan-boy before I switched to Linux, and that global menu is most comfortable to me. Plus, my tiny netbook screen is easier to use when all of my screen real estate goes to content, with no space wasted on window decorations or menus -- this isn't possible in Gnome 3. So I will continue to use Ubuntu on my 32-bit Intel Atom netbook, for now.

Open source it, or it won't exist for long.

Open source software is the most important thing for your computer. It means you have full control over all of the data on your computer. Linux (and other less popular open source operating systems) offers this control, whereas Windows, Mac OS, and iOS emphatically DO NOT.

Why is this important? The simplest example I can think of is this: let's say you make a photo album on Pinterest or Facebook, or perhaps with some fancy, easy-to-use software on your Windows computer. Ten or fifteen years from now, chances are you will not be able to get those pictures back. On the social network, those photos will be buried under thousands of more recent photos. Even if you can find them, how long will it take to move those photos to another application? Will you have to click on each one, copy or download it, and paste or re-upload it? How long will that take with 15 years of photos?

The album you made and saved to your Windows computer's hard drive or iPhone (unless it is a PDF file) will be useless because, three PCs and who knows how many smart phones later, the software you used to make it no longer exists, and you have nothing with which to view the files. All your photos will be trapped in a data file that doesn't open no matter how many times you click it. And unless you are rigorous about keeping paper-printed photos, you will wonder why so few photos of your youth survive today.

What happened to that software? Well, maybe the software is owned by some brainless company that cares more about its bottom line than keeping you happy, especially if you haven't been buying all of their latest software and keeping your files up to date. Or maybe that software was intellectual property bought out by some other company that decided it wasn't a profitable product line and no longer makes newer versions for newer computers and smart phones. Or maybe you changed from an iPhone to a Windows Mobile device, and the software you used doesn't have a version for your newest smart phone.

With open source, you are always guaranteed to have this data, regardless of the software you use in the future. Instead of printing everything, keep backups on an external hard drive, and you will be able to view, edit, and share the data you create forever. Open software isn't owned by some brainless company; it is just out there on the Internet, waiting to be downloaded and used by anyone who is interested. Your data isn't trapped in some social network, where the only way to view or edit it is by clicking on every single item. The data is right there on your computer, in a file you can view, edit, and share.

This goes for more than just photos. It goes for movies, stories you wrote, presentations, spreadsheets, anything that is your data.

Why not just keep buying Microsoft Office or Adobe Photoshop? Well, consider how much these software packages cost (hundreds, sometimes thousands of dollars) compared to open source software (most of which is completely free). What sense does it make to pay full price for every new release, or for every new computer you buy? What happens if you decide it's not worth the cost anymore? What will happen to your data when you stop paying for every new version of the software that made that data?

Please realize that if the digital data you create is important to you, open source is the only way to guarantee it will always work without paying continuously for upgrades to the software that created it. Think ahead: open source is the only way to go.

Wednesday, July 11, 2012

A Brief Reflection on Linux and the History of Personal Computers

Linux has fought its way up, and is now the de facto standard operating system nearly everywhere outside the desktop. If you intend to continue your career as a software developer, learning Linux is absolutely essential.

If you are not a developer but haven't tried Linux yet, you really should try it. Just download Ubuntu; it's free. "Tech support" is community-driven, so when you post a message on the online message boards asking for help, you will get a reply from a real person who once had the same problem as you. You can also buy more proactive support.

You Own the Data You Create

The most important reason to use Linux is that your data belongs to you. When you change from Windows to Linux, you may lose some of the data you created, because you will no longer be able to use the software that created that data -- that software only exists on Windows. What happens when you want to use a computer made by a competitor of Microsoft? You will never see that data again; Microsoft won't help you move to a competing platform. This is explicitly Microsoft's strategy: if their customers find it too difficult or too costly to use anything but Microsoft software, then they don't have to compete with other software companies. For the longest time, Microsoft simply never had to compete in the desktop computer world, and this is why they have consistently failed to innovate.

This is not true going the other way. Take for example Ubuntu and Android, which are both based on Linux. You can switch back and forth between them without losing your data, because software written for Android can be (and usually is) made to work on Ubuntu, and vice-versa. This gives you, as a consumer, much more freedom and choice. Any operating system based on Linux will have this freedom.

There is a law governing the Linux world, colloquially referred to as "open source." This law essentially requires that anyone who distributes the software must also provide its source code, so the software can always be made to run on a different platform. This means that even if the original author of your favorite software is unwilling to make it work on that new computer you bought, fret not: for as long as the software is useful, you will be able to find at least one person smart enough to keep that software running on Linux, and the law guarantees that this person will have the freedom to do so. It is like a lifetime guarantee on the data you created. The software you made it with will always be around, for any operating system, so long as there is interest in it.

This quality is paradoxically provided to you by the General Public License (the open source law governing Linux), which explicitly states that there is no guarantee of any kind associated with the software. Linux, and almost all Linux software, has attached to it the following legal disclaimer:

This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
Practically speaking, what this means is that programmers have the freedom to make the software work on your computer without worrying about being sued. It is this freedom that allows for innovation.

Linux is the Operating System, for Any Computer

Linux is the most used OS in the world, and is used almost everywhere, excepting only home desktop computers and laptops, where Windows obviously still has a death grip -- and now even that is starting to change. It hasn't happened yet, but there is very real potential for Android, Chrome OS, or Ubuntu to become a true competitor in the laptop and desktop world. Apple seems less interested in home computers these days (more interested in cell phones and tablets), and that leaves lots of room for Linux in the home.

In fact, Linux is already in the home: it is in TiVos, making your TV set work, and it is the heart of the Android operating system, so it may well be in your cell phone too. Linux runs the Google search engine, Linux runs Facebook, Linux runs the majority of all applications on the Internet. Linux is just not the operating system on your desktop PC.

The Lost Decade of the PC

The desktop computer platform has been hurt the most by the innovation vacuum caused by Microsoft. People still think either Windows with MS-Office, or Mac with MS-Office, are their only two choices. Many people still think software comes in discrete units that you can buy in boxes off of shelves at Wal-Mart and Best Buy. With this old way of thinking, you may have found yourself in this situation:

Yay! The computer I bought from Amazon.com has arrived on my doorstep today... but I need anti-virus software. I guess I'll just pick one up next time I am at the mall.

Now, Windows is starting to fade away, and the iPhone and Android "app stores" have taught people just how ridiculous this old view of software is. Microsoft is fully to blame for making you think things would always be that way. Now we know better: you don't buy software in boxes. This is how the world works now.

And incidentally, if people are going to pay for an operating system, then anti-virus should be the responsibility of the software maker, not the consumer. The warranty on Microsoft Windows disavows any responsibility for any loss of data caused by viruses. The difference between Windows and Linux is, therefore, that you pay money for Windows, but if you ever get sick of dealing with Microsoft, you aren't allowed to change to a competing operating system without losing much of your data. Sure, Linux doesn't guarantee protection from viruses either, but at least you can move your data from backups onto another operating system if you want.

Things have evolved, but Microsoft has not. In fact, Microsoft has just signed new agreements with all of their OEMs to make it impossible to install Linux onto their computers unless you disable the "Secure Boot" feature which Microsoft is now bullying the OEMs into building into their hardware. Microsoft shows no sign of wanting to actually compete with Linux by innovating.

The simple fact is, they cannot compete; Microsoft does not have what it takes to survive in the software industry anymore. They are merely coasting on the momentum of general consumer ignorance, the popularity of their Windows, Office, and XBox product lines, and the patent and copyright laws that protect them from competition.

The Bright Side

Personally, I am most thankful that Steve Ballmer has done such a terrible job: he has effectively weakened Microsoft's death grip on the software industry. As a result, we are seeing a small explosion of innovation, particularly with the idea of cloud applications and app stores. Nowadays your average Joe knows there is more choice than just "Mac or PC"; now Android and Ubuntu are in the game as well (although Ubuntu is still most unfortunately _not_ a household name).

If software patent laws weren't so thoroughly hindering innovation in software, we could see the computer industry mature to its full potential; we could see what computers are truly capable of. But we are fortunate to have a fleeting glimpse of that full potential now that Microsoft is much less the monster it used to be. The software world we see today would have happened much sooner had people not been convinced, all throughout the 90's and the first half of the last decade, that Microsoft was the only operating system worth using.

To be honest, I hope this new Apple CEO, Tim Cook, is to Steve Jobs what Ballmer is to Bill Gates. If Apple goes down, and Microsoft stays down, and if software patents are ruled illegal, then you won't believe how much competition and innovation we will start to see. I think it is completely possible that the resulting technology could improve the quality of life for every person on the planet, if only it had the chance.

Sunday, September 11, 2011

10th anniversary of the murder of over 3000 people

Before the day September 11, 2011, comes to a close, I thought I might take a bit of time to remember. I offer my sincerest respects and sympathies to those who survive the victims. I was 18 at the time.

TL;DR: The rest of my pondering is not so much about 9/11 as about the political turmoil that ensued.

Although I was enraged at the atrocity, I was probably more enraged that, to the 24-hour news corporations and political pundits alike, it was the greatest thing in the world to see all those people murdered. Almost no mention was made of the fact that there had been signs of such an attack on the horizon for years, and that further precursors were largely ignored by the Bush administration in the week leading up to the attack. Instead of saying, "we should have seen it coming," everyone acted surprised and outraged, basked in all the attention their news networks were suddenly receiving, and wasted no time assigning scapegoats: mostly liberals and Muslims.

An unpopular president instantly became very popular, and exploited his position as only a skilled politician can. If I were a soldier, I would have hated to be sent to Iraq, risking being killed or mutilated by an IED every single day, knowing full well that other soldiers were in Afghanistan fighting the real bad guys. And I would have hated all the military propaganda telling me every day that I was risking life and limb for the sake of the motherland, and that our campaign in Iraq would ensure peace and safety for the United States someday.

Still, given so many good Americans were working on military operations in Iraq, I suppose it was inevitable that an awful situation could be made better: Saddam Hussein was caught, and things have sort of improved in Iraq, despite all the other problems we caused and the countless lives lost.

Now, 10 years later, the real murderer was finally killed by an even more unpopular president. I have lost almost all respect for Obama over the past 2 years, but he does deserve respect for giving the order to actually catch Bin Laden. Cheers to the CIA and special-ops guys who caught him. They clearly did their job well.

I wonder how hard Bush actually tried to catch Bin Laden. Was the intelligence on his whereabouts really so difficult to gather that it took 10 years? One TV show I saw suggests that a dedicated team of counter-terrorism experts and a single, nearly invincible agent with a flawless sense of patriotism and duty could have done the job in 24 hours. But I guess that show is more entertaining than it is realistic.

In the meantime, our homeland is increasingly falling under the control of religious fascists in a manner reminiscent of Iraq, Iran, and North Korea. Those of us who actually understand the importance of democracy, free enterprise, free speech, and the separation of mosque and state, er... I mean separation of synagogue and state, er... I mean separation of church and state, er... I mean the separation of RELIGION and state, have been trying, and often failing, to fight back the fanatics with everything we can afford to throw at them. Fortunately, we have an excellent (but grossly over-funded) military, and I give thanks to those who serve to protect us from foreign invasion. Because of them, our greatest threat to freedom is not from abroad. The threat is only from within our borders... and the threat is not coming from the liberals.

Hopefully, we aren't turning into the dystopian nation depicted in 1984, with its thought police, perpetual wars, and Ministry of Truth. The thought police are not necessary because your church community does the job of policing your thoughts quite well enough. The Ministry of Truth is not necessary -- we don't need a government-run propaganda factory because the free market has created News Corp and Fox News for you. And our perpetual war is... well, maybe it will end someday.

So here is the historical narrative: this police state that is slowly enveloping us has been at least 30 years in the making. September 11, 2001 was only the tipping point -- the critical event that set off our landslide down the slippery slope. But as long as there exist people who agree with this narrative, perhaps we can avoid a police state after all, and climb our way back up the slope. Let's never forget that.