I'd like to deviate from Go for a moment. These last few weeks I've been distracted from programming by switching my main desktop from Ubuntu to Arch Linux and by helping my wife with our home-based business, Geekling Designs. The Christmas shopping season is busy enough as it is, but we're also in the midst of some major upgrades to our business. What with setting up a new small-business ERP software system (an oxymoron, I realize), building a new custom light exposure unit and purchasing a commercial-grade screen printing press, we've been a tad busy. And then there's the desktop switch.
As I stated in a previous article, I've been a long-time fan of Ubuntu. Say what you will about it, it allowed me to move from a casual Linux user to a full-time user. It gave me the experience required to build an LFS system for the first time and encouraged me to finally grow as a programmer and invest in Open Source Software.
I've now finished the process of switching my main system from Ubuntu to Arch Linux. I elected to use LXDE as my desktop environment for something with a little more pep. Neither my wife nor I care much for the flash of most modern desktops (KDE, GNOME 3), and LXDE can be made quite elegant in its own right without all the cruft that comes with the other two environments. I considered using Lubuntu, but I felt it was time to cut my ties with the whole Ubuntu ecosystem for a little while. It's nothing personal against Lubuntu; I run it on my venerable laptop and I've installed it on friends' computers. I just need my space. It's not you, it's me.
Of all the things that Arch has going for it, I must say that its wiki documentation and forums are absolutely top-notch. It's rare that I can't find an answer to my issues in either source. Even my wife has supported the change and has been impressed with the difference in speed between the two systems. I even switched to the nouveau graphics driver. For regular desktop use it appears to work superbly, much to my surprise!
Every system has its quirks. Take, for example, installing the Eclipse IDE. Let me preface this by saying this is less an Arch issue than an Eclipse one. The version of Eclipse in the pacman repositories does not come with the Marketplace plugin installed, nor could I get it installed. Needless to say, it struck me as odd that the version of Eclipse recommended by its developers was not the one I found in the repositories. I'm sure there is a valid reason for it, but it irked me nonetheless. I ended up uninstalling the Arch version and installing it from the Eclipse website. What can I say? I'm lazy and I like the ease of installing plugins from within Eclipse itself.
That said, the move has gone as smoothly as could be expected. No loss of personal data (thanks to keeping my /home directory on a separate HDD), just the usual issues with installing required software and tweaking settings to work with the new system. Though things usually "work out of the box" like they tend to in Ubuntu, some packages do need a little extra work.
That's the price you pay for freedom!
Friday, 2 September 2011
An Ubuntu User Testing Arch Linux
First, some background.
I run Linux on an older computer. Were I to have purchased it from a retail store I would likely call it "venerable", but since I built it from scratch as a gaming computer the old girl has a bit more kick. Regardless, it's nine or ten years old and just can't handle modern gaming any more. Still, as a basic desktop machine, it works fantastically!
I have tried several Linux distributions over the years. I shudder to recall that my first distribution was Caldera Linux (see SCO to understand), which was soon followed by Mandrake Linux (now Mandriva) because of its easier customization. I was always in love with the idea of Linux, but I struggled with the technical side of maintaining it, like upgrading applications from source. Installation, too, wasn't like it is today with distributions like Fedora and Ubuntu. You actually had to know something about your system. You usually needed to know every small detail about your sound card, if you could get it running at all. You needed to know every spec and detail about almost every piece of hardware in your system, and that assumed it was compatible with Linux in the first place (Winmodems, anyone?). It wasn't for the faint of heart, which is likely how Linux got the reputation for being only for computer geeks.
I heard about Ubuntu early on, but its quirky release names initially turned me off. However, its quick rise in popularity and numerous glowing reviews finally convinced me to give it a try, so when its fourth release, 6.06 Dapper Drake, came out I burned a copy and installed it. I loved it. Ubuntu altered my perception of Linux forever. I turned into a fanboy overnight and was suddenly encouraging anyone I could find to switch away from the diabolical Microsoft and finally live a life of Freedom! Even if I couldn't get them to switch from Windows, I was at least pushing free software. What can I say? I was naive. My heart was in the right place, though.
Now, at version 11.04, I am starting to get frustrated with Ubuntu. I love all the software in their repositories. I love Launchpad and how you can set up PPAs (Personal Package Archives) as additional repositories. Yet I also have a few contentions. I don't like how software is only updated at six-month intervals unless a security patch needs to be released. This is especially important with web browsers, because they change so quickly and have so many security updates that the Ubuntu team just can't, or won't, keep up. What if you want to use a feature in a new version of a program but it won't be updated for another few months? This is problematic for me because of the age of my video card, but more on that in a moment. That leaves me with either manually upgrading programs myself from source or upstream binaries, or relying on non-official repositories or .deb files. Either way, installing by those means often proves problematic during a system upgrade (were I to later do so) and negates the entire reason for using a system like Ubuntu anyway.
At the time of purchase, my nVidia GeForce 4 4800 Ti was top of the line for consumer video cards. Unfortunately, it has been considered a legacy card by nVidia for many years. Despite that, they still update it to run with the latest Xorg server, but it always takes a few months while they make sure their most recent cards are kept up to date first. Understandable. Ubuntu, though, has a long process for integrating fixes into their software distribution. So, while you may be able to install the nVidia Linux driver from scratch (which I've done), you have the additional issue of having to re-install the driver every time the kernel is updated. Installing the driver isn't as simple as just running the installer, either, because you can't install it while X is running, so the process gets old fast. I could also just choose not to upgrade to new releases of Ubuntu, but then my applications won't get updated and support eventually gets dropped altogether. It's a no-win situation.
Now that I've set the stage, let's move on to Arch.
So, I've begun the task of looking at other distributions which may be friendlier to my situation. Now, I have accepted the fact that I am always going to have issues with my graphics driver being out of date because of its age. What I really want is to keep my applications up to date while I wait for the appropriate updates to the nVidia driver. Oh, and to not have to monitor, hunt down, and manually upgrade each application myself. There's a reason I don't use Windows as my main desktop any more, after all. My only real choice, therefore, is a rolling or semi-rolling distribution. The first distribution of that type that popped into my head was Arch.
Arch Linux has interested me for a while, namely because it started in Canada, where I was born and continue to live. It's not for the uninitiated, though. Its installation process is a bit of a throwback to the Linux days of old, where you need to understand the entire process to install the system. Not that that scares me any more. I've built an LFS system (an experience I HIGHLY recommend to anyone with the guts to try it) at least three times now. It's not my first time at the rodeo. It more came down to whether I was too lazy to go through the process and maintain such a system or not.
Rather than install Arch on my main system, I opted to run it on my backup/test Linux box. This second box is almost identical to my main one in most respects but has an ATI video card and smaller hard drives. I was testing out Fedora 15 on it, which is a good distribution in its own right, but I decided to deep-six it (I still have the installation CD; I can re-install it if I choose to) and installed Arch. As I mentioned before, Arch is not for the weak. This is both a negative and a positive. On the negative side, it makes it very unapproachable to the common or new user. I've seen many complaints on forums and in articles about the installation procedure. However, you have to understand the counter-argument to appreciate the approach they've taken. They're not trying to be another Debian, Fedora or Ubuntu. No, Arch Linux is aimed at someone more like me: someone who wants a cutting-edge system with a great deal of control (i.e. less software bloat). Once I installed Arch, I really came to appreciate what they, the Arch developers, are trying to accomplish.
Installing Arch isn't actually that hard if you have a little patience. For one, it uses a text/ncurses-based installer rather than a graphical one. While "ugly" compared to a GUI, it actually makes the process much, much faster. It also asks some tough questions that an absolute newbie may not know the answers to, but help is near. It helps if you can run two PCs side by side, one with the installation instructions on it and a second to install the OS onto. If you don't have two computers, you may want to consider printing the instructions first. The installer also tries to recommend settings and offers some advice for people who just want the most basic setup, and a lot of steps can be skipped. By and large, you can just select the default choices and you'll be fine. Arch also requires you to edit some configuration files with an editor, both during the installation process and post-installation. This will scare a lot of people off, but again, it's not so bad. The installation instructions on the website hold your hand through the process and tell you what to edit and why. Trust me, it's not that hard. It's like looking up a mountain and thinking you can never get to the top, but once you get on the path you realize that it's really not so steep, and it can turn into a pleasurable experience.
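To give a rough idea of the kind of editing involved, here is a minimal sketch of what the main system configuration file, /etc/rc.conf, generally looked like on Arch at the time; the values (locale, hostname, daemons and so on) are placeholders, not anything you should copy verbatim:

# /etc/rc.conf (abridged; values are examples only)
LOCALE="en_CA.UTF-8"
TIMEZONE="Canada/Pacific"
KEYMAP="us"
HOSTNAME="archbox"
MODULES=()
DAEMONS=(syslog-ng network netfs crond)

Each line is a plain shell-style assignment, so editing it with nano or vi during the install is no more exotic than filling in a form.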
Once the base system is installed you can reboot into your new system. I already knew that I was only going to get a command prompt, but those who are used to ready-out-of-the-box systems might be in for a little shock. Arch only sets up a basic, mostly bare-bones system with a command line interface. Therein lies the advantage of Arch. Similar to Ubuntu's core installation setup, you can pick and choose exactly what you want to install. Dependencies are resolved for you just like in other distributions that use package repositories. Arch lets you choose between installing helpful meta packages, which install a complete set of software such as a desktop environment, or you can choose the specific individual packages you need. You may, for example, want to install the X11 meta package but hand-pick a window manager, file manager, etc. to build your own desktop environment. Removing packages installed by a meta package doesn't break the upgrade process, either, like it can with Ubuntu. Try to uninstall a program installed by a meta package like ubuntu-desktop and Ubuntu will scare the pants off you by saying it's going to uninstall the ubuntu-desktop package too! It doesn't uninstall the whole system, of course, just the meta package, but it scares a lot of newbies to Ubuntu, myself included.
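As a rough sketch of what that choice looks like in practice with pacman (the group and package names here are examples; check what's actually in the repositories before copying them):

# install a whole package group, e.g. the X.Org stack, in one go
pacman -S xorg

# or hand-pick individual packages to assemble your own desktop
pacman -S xorg-server xorg-xinit openbox pcmanfm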
I am still learning Arch, but I like what I see so far. I have a lot to learn yet about pacman, Arch's aptly named package manager. Pacman seems to be faster overall than apt, too, likely because it doesn't use the Debian package format, just a compressed tarball. There is still a lot of tweaking left to be done to get the system running exactly as I'd like, but I love how easily things are configured. It's both a plus and a minus that newly installed software isn't automatically configured to run optimally with your system the way it is in Ubuntu, but it's also less necessary with Arch because there aren't many, if any, Ubuntu-only changes to other packages or systems that you have to contend with. Settings in Arch are the same settings you'd use on any built-from-scratch or configured-from-scratch system, so finding help is universal; again, no Ubuntu-only solutions. That's not to say Arch doesn't automatically create the very minimal configuration needed to make sure a new application will run in a secure and efficient manner, only that you're encouraged to tweak to your heart's content.
My first impression is that Arch is the system for someone who wants a lot of control over their system mixed with the conveniences of a hands-off system like Ubuntu. Ubuntu could, I think, be what Arch is if they wanted to, but not with their current model. Yes, the Ubuntu team is talking about allowing rolling releases of specific pieces of software, like Firefox, but one or two applications being kept up to date doesn't really help me.
Arch could, potentially, be my own personal silver bullet. I can even block specific packages like the kernel, the nVidia/ATI drivers and Xorg from being updated until the time of my choosing, while the rest of my software is continually updated. Nice! Now to see it in practice...
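For anyone curious how that holding-back works, pacman reads an IgnorePkg line from /etc/pacman.conf. A minimal sketch (the package names are illustrative, and the kernel package in particular may be named differently on your system):

# /etc/pacman.conf (excerpt)
[options]
# hold these back until I'm ready to upgrade them by hand
IgnorePkg = linux nvidia xorg-server

Packages listed there are skipped during a full system upgrade until you take them off the line or install them explicitly.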
Wednesday, 24 August 2011
Compiling C Programs in Linux
Many beginner programmers, especially those new to the Linux world, have trouble compiling programs. This isn't exclusive to C, of course, so hopefully you can take something away from this post and apply it to other languages, too.
There are two primary stages to "compiling" a program. The first stage is compilation proper. There are several intermediary steps, but the end result is that your code, in this case C, is translated into machine-readable form. The second stage is called "linking". This is when the parts of your program and any external libraries are linked together to create either an actual executable binary file you can run or a library file.
Disclaimer - This post may be hazardous to your health (not really). It is a non-exhaustive and non-authoritative introduction to compiling and linking programs on the command line in Linux. I can't emphasize enough that you should reference the documentation provided by the compiler you're using and that said documentation will always trump anything contained herein. Side effects may include: nausea, dry mouth, hair loss, severe depression and momentary blindness. If any of these side effects occur, discontinue use immediately and contact your physician.
Stage 1 - Compilation:
The Compiler:
First off, you have your choice of compiler. We are going to concern ourselves with the GNU C compiler (gcc), currently the most widely used C compiler for Linux. LLVM has been gaining ground, but we won't be discussing it here, though it stands to reason much of what you learn would still be applicable. gcc is part of GCC, the GNU Compiler Collection, which is actually a collection of compilers for several programming languages. Each compiler has a unique name which reflects the language it is designed for. For example, compilers exist for the following languages in addition to C: C++ (g++), Java (gcj) and Fortran (gfortran). Only some, though possibly all, may be installed on your system by default.
Important: Do NOT use g++ to compile C language sources. There's a reason why they are separate compilers.
In the case of C, make sure you have gcc installed on your system. Open a terminal and type:
gcc --version
If you get an error of some kind, chances are you don't have GCC installed and will need to do so before you can go any further.
Hint: For those of you running a Linux system like Arch, Debian, Fedora or Ubuntu, you can install any of the tools discussed within this document quite easily. For example, Debian-based systems usually have a 'build-essential' meta package, available via apt-get, for installing commonly used tools for building programs from source. You can simply issue the command 'apt-get install build-essential' (without the quotes). Check your specific distribution's instructions for installing these tools from their respective repositories.
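As a rough guide, the usual commands look something like the following; the exact package or group names can vary between releases, so treat them as examples rather than gospel:

sudo apt-get install build-essential          # Debian / Ubuntu
sudo pacman -S base-devel                     # Arch
sudo yum groupinstall "Development Tools"     # Fedora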
Everything in this how-to should be run in a terminal. Make sure you change to the directory where your source files are contained and execute the supplied commands within that directory. There are many, many compiler flags to know, but the following should be considered a minimum:
gcc -Wall -pedantic -std=c99 my_file.c -o my_program
...replacing "my_file.c" and "my_program" with the proper names of your file(s) and desired program name.
Compiling a source file with the above flags produces a final binary in one step; gcc quietly performs the linking stage for you. However, as your programs get more complex it becomes advantageous to build intermediary object files first and then link them together later.
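To sketch what that two-step workflow looks like (the file names here are hypothetical):

# compile each source file to an object file, without linking
gcc -Wall -pedantic -std=c99 -c main.c -o main.o
gcc -Wall -pedantic -std=c99 -c helpers.c -o helpers.o

# link the object files into the final executable
gcc main.o helpers.o -o my_program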
Compiler Flags:
These flags are passed to the compiler to tell it what you want. There are some very important ones to know:
-std=c99 - This flag sets the specific standard for gcc to comply with. At the time of writing, gcc defaults to a standard called gnu89, which is the c89 standard with GNU extensions. If you plan on writing a program which may be compiled on a system which does not use gcc as its C compiler (LLVM, MS Visual Studio, Borland C, etc.), then it is probably in your best interest to force the use of the most current C standard and disable the GNU extensions. gcc is not 100% c99 compliant, but the non-compliant features are rarely used. If you require any of the features of c99 which have not yet been implemented in gcc, then you'll have to use another compiler anyway.
-o - Specify an output name for the object or binary. In the most basic case, you use it to specify the name of the program you are producing. If you don't use this flag, gcc will default to using the name of the source file and create a .o file for an intermediary object (myfile.c becomes myfile.o) or, in the case of creating a binary, it will default to a.out for the binary file.
Compiler Warnings:
Warnings should always be turned on. Here are some very important/common ones to know:
-Wall - all warnings; this is misleading because it doesn't actually turn on ALL warnings but just the most commonly desired ones.
-Wextra - turns on extra, more strict warnings.
-ansi - specifies that you want to adhere strictly to the C standard. It turns off all GNU extensions to the C language, keeping your code fully ANSI compliant. This is important for portability between compilers. As stated earlier, gcc defaults to gnu89 (c89 with GNU extensions), and the -ansi flag explicitly disables those extensions; specifying a strict standard such as c89 or c99 with the -std flag has the same effect. -std=gnu89 -ansi is equivalent to -std=c89.
-pedantic - issues all the warnings demanded by the strict ISO C standard, flagging GNU extensions and other non-standard constructs. (If you want those reported as hard errors that halt compilation, use -pedantic-errors.) It is usually a good idea to enable this flag whenever you give -ansi or -std=c89/c99.
Extra Compiler Flags:
-c - compile object code but do not link. This produces a file with the .o extension. Object files, those ending with .o, are later linked together to create a final binary or library.
-I <directory> - This flag allows you to specify an additional location for the compiler to find your header (.h) files, by replacing <directory> with the location of the headers being searched for. This is useful if you store your header files in a directory other than the one your regular source (.c) files are located in. You can chain as many of these flags together as you need to add multiple directories. If you don't know why you would need this, then it's safe to just leave it out (there's an example below).
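Putting the compilation-stage flags together, a hypothetical project that keeps its headers in an include/ directory and its sources in src/ might compile a single file like this (the paths and file names are placeholders):

gcc -Wall -Wextra -pedantic -std=c99 -Iinclude -c src/my_file.c -o my_file.o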
Stage 2 - Linking:
The Linker:
As noted earlier, building a program is a two-step process. The second step, after compiling, is linking. In most, if not all, large projects, sources are first compiled into object files. On their own, they do nothing and are essentially useless. In order to work, they need to be linked together to create a single executable binary file, an actual program. This is where the linker, named 'ld', comes in. The linker has its own set of flags, which may be passed either to gcc or to ld itself. It is probably easier, and in your best interest, to pass them through gcc for simplicity's sake.
Linker Flags:
-l<library> - where <library> is the name of the external library you need to link into your program, sans (minus) the 'lib' prefix. In other words, if you wish to link the math library, libm, into your program, you drop the 'lib' part and use -lm. The linker then searches its library paths for a matching libm.so or libm.a.
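A quick, self-contained illustration: the toy program below (hypothetical file name sqrt_demo.c) calls sqrt() from the math library, so the link step needs -lm:

#include <stdio.h>
#include <math.h>

int main(void)
{
    double x = 2.0;
    /* sqrt() lives in libm, which is not linked in by default */
    printf("The square root of %.1f is %f\n", x, sqrt(x));
    return 0;
}

gcc -Wall -pedantic -std=c99 sqrt_demo.c -o sqrt_demo -lm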
Extra Linker Flags:
-static - Used to force libraries linked to your program statically. This means that the library itself is incorporated (bolted on) into your program. This increases the size of your program but does away with some of the issues associated with dynamic linking. A discussion on dynamic vs. static linking is WAY beyond the scope of this article.
-dynamic - The counterpart to -static: libraries are dynamically linked, meaning the library is referenced by your program but not actually loaded until the program is run. On Linux this is already the default behaviour, and with the GNU toolchain mixing static and dynamic libraries is usually done with the linker options -Wl,-Bstatic and -Wl,-Bdynamic.
-L <directory> - Specify a path in which to find extra/custom libraries, by replacing <directory> with the location of the libraries being searched for. This can be used to link external libraries not installed in the normal paths searched by the linker. It is also used to specify directories within your project if you are linking against internal libraries.
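For instance, if your project had built a custom static library as libs/libwidget.a (both names hypothetical), you could link your objects against it like so:

gcc main.o -o my_program -Llibs -lwidget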
The Next Steps:
For compiling very small programs it is usually easiest just to run gcc from the command line. As programs get larger, though, it usually becomes necessary to remove some of the complexity. The next step would be to create a custom Makefile to build your program. You may then simply issue a single command, make, and the rest of the work is done for you. I mentioned earlier that it becomes simpler to compile sources into objects first and then link them later. That is because the larger a project gets, the longer it takes to compile and link. By utilizing a Makefile, only files which have been modified are recompiled and re-linked, thereby speeding up the build process when changes are being made. This is especially helpful when debugging.
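As a minimal sketch of such a Makefile, assuming the same hypothetical main.c and helpers.c from earlier (note that the indented command lines must begin with a tab character, not spaces):

CC = gcc
CFLAGS = -Wall -Wextra -pedantic -std=c99
OBJS = main.o helpers.o

# link the object files into the final program
my_program: $(OBJS)
	$(CC) $(OBJS) -o my_program

# compile each source file; the header is listed so changes to it trigger a rebuild
main.o: main.c helpers.h
	$(CC) $(CFLAGS) -c main.c -o main.o

helpers.o: helpers.c helpers.h
	$(CC) $(CFLAGS) -c helpers.c -o helpers.o

# remove build products
clean:
	rm -f $(OBJS) my_program

Running make after editing only helpers.c rebuilds helpers.o and re-links, leaving main.o untouched.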
The next step after that would probably be to use a full build system like the GNU autotools. Autotools is a general term for a suite of programs: aclocal, autoconf, autoheader, automake, autopoint and libtool. The GNU autotools provide a method of making your code more portable and easier to distribute. They can even roll a tarball for you and compress it. By utilizing tools like GNU gettext and GNOME's intltool you can integrate translations into your project, too. The autotools are the backbone of other distribution methods as well, like creating an .rpm or .deb in Linux, and knowing how to use them tends to be an essential skill. There is plenty of help on the Internet for using GNU's autotools; just do a web search.
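To give a flavour of what the autotools ask of you, a bare-bones, hypothetical setup might consist of just two short files (the project name, version and file names are placeholders):

# configure.ac
AC_INIT([my_program], [0.1])
AM_INIT_AUTOMAKE([foreign])
AC_PROG_CC
AC_CONFIG_FILES([Makefile])
AC_OUTPUT

# Makefile.am
bin_PROGRAMS = my_program
my_program_SOURCES = main.c helpers.c helpers.h

Running autoreconf --install generates the configure script and Makefile.in, after which the familiar ./configure && make does the rest.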
If you're having trouble, read: How to Get Help