This one's specifically for Dumpster Diver.
Source: LifeHacker
What operating system do you use? For some, that question may as well be posed in Latin or Sanskrit. For others, it's an invitation to have a heated debate about the benefits of GUI vs. command line, modern-day UI vs. old-school metaphor, the pros and cons of Windows 10, LAMP vs. IIS … the list goes on and on. For most, however, the answer will be a variation on Windows or Mac.
But anyone who has used Windows (in any of its incarnations) long enough knows that, at some point, frustration will rule the day: you'll be working along when, seemingly out of nowhere, Windows decides to apply updates and restart, putting your work at risk while you sit through the lengthy update-and-reboot process. Or what about the inevitable virus or malware infection? You spend precious dollars on antivirus software or, in the worst case, you have to send the machine to your local PC repair shop to get the virus removed. All the while, work is not being done. While Apple's macOS products suffer less from the vulnerabilities found in the Windows platform, they also come with a fairly hefty price tag.
There is, however, an alternative to both that costs nothing to download and install, and is far less susceptible to viruses and malware. That operating system is Linux. What is Linux? Let's take a look.
So what exactly is it?
Linux came about in the mid-1990s, when then-student Linus Torvalds was tasked with creating a disk driver so he could read the Minix file system. (Minix is a POSIX-compliant, UNIX-like operating system that saw its first release in 1987.) That project eventually gave birth to what would come to be known as the Linux kernel. The kernel of an operating system is an essential core that provides basic services for all aspects of the operating system. In the case of Linux, the kernel is a monolithic, UNIX-like system which also happens to be the largest open source project in the world. In the most basic terms one could say, “Linux is a free alternative to Microsoft Windows and macOS.”
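If you already have access to a Linux machine (a friend's computer, a live session, or a remote server), you can see the kernel in action with the standard `uname` command. A quick illustration; the exact version string will differ from system to system:

```shell
# Print the kernel name and release of the running system.
# On a Linux machine this reports "Linux" followed by the kernel
# version; the numbers vary per distribution and install.
uname -sr
```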
Linux is a ‘can do’ platform
For those who are concerned about getting their work done with Linux, let's take into consideration how the average user works with a computer and how Linux can meet those needs. For the average user, a computer is a means to:
- Interact on social media
- Read email
- Listen to music
- Watch YouTube or Netflix
- Occasionally write something
Five years ago, each of those tasks would have been handled via a different application. Now, not so much. Modern computing tasks are most often relegated to a browser. Facebook, Google Docs, Netflix, Outlook 365… they’re all used within the likes of Chrome, Firefox, Safari, or Internet Explorer. Each one of those browsers does a good job of enabling the user to do their thing. It’s only on very rare occasions that a user will land on a site that will only work with one of the above browsers.
So considering that the average user spends most of their time within a browser, the underlying platform has become less and less relevant. However, with that in mind, wouldn’t it make sense to use a platform that doesn’t suffer from the usual distractions, vulnerabilities, and weaknesses that plague the likes of Windows? That’s where Linux shines. And with Linux being open source, users are not only able to use the platform for free, they can also alter and re-distribute the operating system as their own distribution.
Linux lets you customize and share
There are basically two different types of software: Proprietary and open source. With proprietary software, the code used to create the application (or operating system) is not available for public usage or viewing. Open source, on the other hand, makes the code used to create the software freely available. While the average user might not be concerned with the option to make alterations to their OS, this functionality of Linux helps to explain why this operating system doesn’t cost you anything. Linux is an open source platform, meaning that the code is available for anyone to download, change, and even redistribute. Because of this, you could download the source code for the various elements that make up a Linux distribution, change them, and create your very own distribution.
And as for that distribution, this is very often a point of confusion with new users. As mentioned above, Linux is really just the kernel of the operating system. In order to actually use it, there are layers that must be added to make it functional. The layers include things like:
- Device drivers
- Shell
- Compiler
- Applications
- Commands
- Utilities
- Daemons
Developers will sometimes adapt those layers to achieve different functionality, or swap out one component for another. In the end, the developers create a unique version of Linux, called a distribution.
There are (quite literally) thousands of Linux distributions available. To see a listing of which distributions of Linux are popular, take a look at Distrowatch.
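If you ever need to check which distribution a given machine is running, most modern distributions identify themselves in the file /etc/os-release (a freedesktop.org convention). A small sketch; `distro_name` is a hypothetical helper name, and the fallback covers systems that lack the file:

```shell
# Report the name of the running distribution, if it can be determined.
distro_name() {
    if [ -r /etc/os-release ]; then
        # PRETTY_NAME and NAME are fields defined by the os-release
        # convention, e.g. PRETTY_NAME="Ubuntu 22.04 LTS".
        . /etc/os-release
        echo "${PRETTY_NAME:-$NAME}"
    else
        echo "unknown distribution"
    fi
}

distro_name
```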
Getting to know a different kind of desktop
One of the biggest variations you will find between the different Linux distributions is the desktop environment. Most users know what both Windows and Mac desktops look like. You might be surprised to find there are some Linux desktops that look and behave in a very familiar fashion. Others, however, offer a rather unique look and feel. Take, for instance, the GNOME desktop (pictured below). This very modern user interface does a great job of ensuring desktop elements are rarely (if ever) in the way, so that interaction with applications takes focus. It’s a minimal desktop that delivers maximum efficiency.
The GNOME desktop as seen on openSUSE, showing the activities window.
But what exactly is the desktop? In very basic terms, the desktop is composed of pieces like the Apple menu, applications menu, menu bar, status menu, notification center, clickable icons, and some form of panel (or dock). With this combination of elements, the desktop makes it very easy for the user to interact with their computer. Every desktop contains a mixture of these parts, and Linux is no exception. With the aforementioned GNOME, you have the GNOME Dash (which is like the application menu), the top bar (which is like the Apple menu bar), a notification center, and you can even (through the use of extensions) add a customizable dock. Without a desktop environment, you would be relegated to the command line; trust me, you don't want that.
The most popular Linux desktop environments are:
- GNOME
- KDE
- Cinnamon
- Mate
- Xfce
There are a number of other desktop options, but the above tend to be considered not only the most popular, but also the most user-friendly and reliable. When looking into desktops, you'll want to consider your needs. For example, the KDE desktop does a great job of functioning like Windows 7. Cinnamon and Mate are similar, but less modern looking. Xfce is a very lightweight desktop, so if you have slower hardware, it makes for a great solution. And again, GNOME is a minimalist's dream, with very little getting in the way of your work.
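As a quick aside, if you are ever unsure which desktop environment a running session is using, most desktops advertise themselves through the XDG_CURRENT_DESKTOP environment variable. This is a convention rather than a guarantee (a bare console or an unusual setup may leave it unset), hence the fallback text:

```shell
# Print the current desktop environment, e.g. GNOME, KDE, or XFCE.
# Most display managers set this variable at login; if it is empty,
# we fall back to a descriptive message instead.
echo "${XDG_CURRENT_DESKTOP:-no desktop environment detected}"
```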
The desktop environment is also where you can interact with applications … which brings us to our most important issue.
Are the application options any better?
This is one area that has been, in the past, a point of contention for Linux. Ask any dyed-in-the-wool Windows fan and they will tell you that, just as with macOS, you cannot run Windows applications on Linux. But that's not necessarily true. Thanks to a compatibility layer called Wine (a recursive acronym for "Wine Is Not an Emulator"), many Windows applications can be run on Linux. It's not a perfect system, and it's not for everyone, but it does enable users to run many Windows applications on Linux.
Even without native Windows applications, Linux still has you covered with the likes of:
- LibreOffice — a full-blown office suite (think MS Office)
- Firefox/Chromium/Chrome — fully functional web browsers (think Safari or Internet Explorer)
- The GIMP — a powerful image editing tool (think Photoshop)
- Audacity — a user-friendly audio recording tool
- Evolution — a groupware suite (think Outlook)
Linux has tens of thousands of free applications, ready to install. Even better, most modern distributions include their own app stores (such as GNOME Software or the Elementary OS AppCenter) that make installing software incredibly easy. Nearly all modern Linux distributions' app stores can be found within the desktop menu. Once you've opened your app store, look for applications like LibreOffice (which is probably installed by default), The GIMP (a powerful image editing tool), Audacity (a user-friendly audio recorder that's great for recording podcasts), Thunderbird (email client), VLC (multimedia player), or Evolution (groupware suite), to name just a few.
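Under the hood, those app stores drive the distribution's command-line package manager, which you can also use directly from a terminal. A hedged sketch: `install_hint` is a hypothetical helper, package names can vary slightly between distributions, and only three common package managers are covered here:

```shell
# Print the command you would run to install LibreOffice and GIMP
# on this system, based on which package manager is present.
install_hint() {
    if command -v apt >/dev/null 2>&1; then
        echo "sudo apt install libreoffice gimp"      # Debian, Ubuntu, Mint
    elif command -v dnf >/dev/null 2>&1; then
        echo "sudo dnf install libreoffice gimp"      # Fedora
    elif command -v zypper >/dev/null 2>&1; then
        echo "sudo zypper install libreoffice gimp"   # openSUSE
    else
        echo "package manager not recognized -- use your distribution's app store"
    fi
}

install_hint
```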
Is Linux for me, and how do I start?
Linux is ready to open up a world of free (and open) software that is reliable, secure, and easy to use. Is it perfect? No. If you happen to depend upon a proprietary piece of software, you might find that Linux (even with the help of Wine) cannot run that application you need. The big question on your mind might be, "How do I find out if Linux will work for me?" Believe it or not, Linux has that covered as well. Most flavors of Linux are distributed as "Live Distributions."
What that means is you can download the distribution ISO image, burn that image onto either a CD/DVD or USB flash drive, insert the media into your computer (either in the CD/DVD drive or USB port) and boot from that media. Instead of installing the operating system, the Live Distributions run directly from RAM, so they don’t make any changes to your hard drive. Use Linux in that way and you’ll know, pretty quickly, if it’s an operating system that can fulfill your needs. Unlike the early years, you don’t have to be a computer geek to get up to speed on most of the readily available Linux distributions. To find out more about Linux distributions, head over to Distrowatch, where you can download and read about nearly every available Linux distribution on the planet.
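The "burn and boot" step above usually looks something like the following sketch. Here, linux-distro.iso and /dev/sdX are placeholders for your actual download and USB device; double-check the device name with lsblk before writing anything, because dd will happily overwrite the wrong disk:

```shell
# Verify that a downloaded ISO matches the SHA-256 checksum published
# on the distribution's download page before writing it anywhere.
verify_iso() {
    # $1 = path to the ISO, $2 = expected SHA-256 checksum
    # sha256sum -c expects "checksum  filename" (two spaces) on stdin.
    echo "$2  $1" | sha256sum -c --quiet -
}

# Usage (only run dd once verify_iso succeeds, and only against the
# correct device -- /dev/sdX here is a placeholder):
#   verify_iso linux-distro.iso <checksum from download page> \
#     && sudo dd if=linux-distro.iso of=/dev/sdX bs=4M status=progress conv=fsync
```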
As someone who has himself been exclusively using GNU/Linux for almost 18 years now, I must add a few comments to the above article.
First of all, the author is not correct about how Linux came to be. I have already expounded on the history of GNU/Linux in episode XIX of the Voices of the Forums video conferences (formerly known as TOTcasts), but I'll reiterate the story below.
Free & Open Source Software has been around for quite a long time, but up until the early 1980s it existed mainly in the form of individual software applications, not yet as a complete operating system.
At that point in time, there was a guy working in the artificial intelligence labs of the Massachusetts Institute of Technology (MIT) named Richard M. Stallman, or just "RMS" to friends. Stallman was frustrated that he could not share the code he himself had written with his friends, because all of that code was the so-called "intellectual property" of his employer, MIT. This is why he conceived the idea of creating an entire operating system as Free & Open Source Software, with the accent in his case lying on the "free as in freedom" aspect.
Stallman realized that if he were to create such an operating system, it would have to be a general-purpose operating system that everyone could use, and so he chose a UNIX-family architecture, as UNIX was a time-proven, robust, secure, scalable, flexible, and portable platform. However, there was going to be one big difference: the new operating system was going to be built on a very advanced microkernel design (I'll get back to this in a minute).
And so, in 1983, Richard Stallman announced that he was going to create a completely free (as in freedom) UNIX-like operating system called GNU, a recursive acronym for "GNU's Not Unix", which additionally, when pronounced with a muted initial "G" as native English speakers tend to do, sounds very much like "new".
In 1985, Stallman founded the Free Software Foundation, and over the years that followed, most of the GNU system was written. This was possible because GNU is a UNIX-like system (it fully mimics UNIX's behavior) and UNIX is a highly modular architecture with interchangeable components. However, due to Stallman's choice of a microkernel design, writing the kernel itself proved far more difficult than anticipated.
The kernel is the highest-privilege part of any operating system. It runs in the processor's highest-privilege ring, and it communicates directly with the hardware. Many different types of kernel exist, but the three best known designs are the microkernel, the monolithic kernel and the hybrid kernel.
In a monolithic kernel, all of the low-level system code — including all the hardware drivers — runs with the same privileges, and in the same memory address space, called kernelspace. This makes a monolithic kernel very fast, but it also introduces potential stability issues, because if one component of that very elaborate code base contains a serious enough bug, then it can hang the entire machine.
A microkernel solves this problem by having everything that doesn't strictly have to be inside the kernel run in the lowest-privilege ring of the processor instead. That code then also runs in multiple separate, isolated memory address spaces, collectively referred to as userspace. As such, if one of those userspace components contains a serious enough bug to crash that component, it won't take the entire kernel (and thus the entire machine) down with it, and a supervisor process can automatically restart the crashed code if needed.
At first glance, that would seem like a very good design, but the problem is that it entails enormous complexity in the communication between the actual kernel and its userspace components, and even more importantly, between the individual userspace components themselves: because they run completely isolated from one another, the only way they can communicate is through some sort of inter-process message bus.
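A loose userspace analogy may help here (nothing in it is actual microkernel code): two processes that share no memory can still exchange a message through a kernel-mediated channel, in this sketch a named pipe, much as microkernel servers can only talk to each other via the kernel's IPC mechanism:

```shell
# Two isolated processes exchanging a message through a named pipe,
# a rough stand-in for the message passing a microkernel relies on.
fifo=$(mktemp -u)
mkfifo "$fifo"

# "Server" process: blocks until a message arrives on the pipe.
cat "$fifo" > "$fifo.reply" &

# "Client" process: sends a message; the two processes share no memory,
# only the kernel-managed pipe between them.
echo "hello via IPC" > "$fifo"

wait                          # let the server finish reading
reply=$(cat "$fifo.reply")
echo "$reply"
rm -f "$fifo" "$fifo.reply"
```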
Now, the GNU developers wanted to use the Mach microkernel as the basis for the native GNU kernel — called HURD — but Mach contained copyrighted code and patents, and so the GNU developers had to work around that. As a result, and with the added complexity of a microkernel design, the HURD never really got off the ground in terms of development. (A preliminary version of it does exist for testing purposes, but it's not usable for production machines just yet.)
Although frustrated by its slow progress, Stallman considered the HURD's delay less of a problem, because the rest of the GNU system was ready and was praised by many IT professionals for its code quality and usability. Given the modular nature of a UNIX-family operating system, all of GNU's userland could in essence be used on top of the kernel of another UNIX operating system without too many problems.
Then, in 1991, a Finnish computer science student (from a Swedish-speaking minority) at the University of Helsinki named Linus Benedict Torvalds was using the Minix operating system on his own Hewlett-Packard Vectra computer, which had a 32-bit Intel i80386 microprocessor in it. Minix was a slim, microkernel-based, UNIX-like operating system written by Andrew Tanenbaum for educational purposes.
However, in those days, Minix was still a 16-bit operating system, written for the Intel i8086 processor. The i80386 could run that code, but only by emulating an i8086, which meant that one could not access more than 640 KiB of main memory, and that the processor's hardware provisions for code-privilege separation and memory protection went unused.
Linus Torvalds was frustrated by this limitation and wanted to explore the true power of his 32-bit i80386 processor. However, again in those days, the Minix license did not permit anyone to modify the code. And that is how Linus eventually decided to start writing his own 32-bit kernel. As the GNU userland code was freely available and its license permitted modification, he began porting small parts of GNU to his own kernel. He initially called it Freax (for "Free UNIX"), but the administrator of the server on which Linus stored his source code found that name so off-putting that he changed the name of the directory (that which Microsoft and Apple users call "a folder") to "Linux".
In August 1991, Linus Torvalds announced his kernel in the comp.os.minix newsgroup on Usenet, offering up the code for free so long as it wasn't going to be used commercially by anyone. In the same vein, he asked whether anyone would be interested in helping him out, and the GNU developers, who were also monitoring that newsgroup, immediately started working with Linus on porting the whole of GNU to the Linux kernel. Unlike the HURD, Linux is a monolithic kernel, not a microkernel, but thanks to Linus maintaining very strict coding standards, it quickly proved to be of very high quality.
Shortly after that, Linus attended a symposium where Richard Stallman gave a lecture about Free Software and the GNU General Public License, and that's when Linus Torvalds decided to license Linux under the GPL. That, in turn, made the GNU developers (Stallman included) decide to put the HURD on the back burner. After all, the Free Software Foundation's objective was to offer a complete operating system licensed under the GPL, and thus "free as in freedom"; with the GPL-licensed Linux kernel working together with the GNU userland in the GNU/Linux operating system, that objective had now been attained.
Another aspect of the article I wanted to touch upon is that the author seems to favor GNOME as his graphical user interface of choice. Most GNU/Linux distributions nowadays come with only one main graphical user interface installed by default, but one can easily install several others alongside it, and then each user (UNIX operating systems are multiuser) can choose the interface of their preference at login time. In fact, that is how it always used to be, until Canonical started the practice of offering only one user interface on its Ubuntu installation media.
The author also mentions KDE, but that nomenclature is outdated now. The name KDE now stands for the collective of software developers behind the project, and from version 4 onward, the desktop environment itself is called Plasma. Plasma is currently in its fifth generation already, though many feel that development is moving too fast for the project to truly mature and stabilize.
I also disagree with the author's claim that KDE/Plasma "does a great job of functioning like Windows 7", because one can make Plasma look like whatever one wants. As a matter of fact, I am still using Plasma 4 here on my own system (because of the stability and maturity issues with Plasma 5 cited above), and I have applied a whole set of customizations that make my desktop actually look more like that of an Apple Macintosh than like Microsoft Windows. In fact, the only thing it has in common with Microsoft Windows is a task bar sitting at the bottom of the screen, and even that is something I can change; I could easily replace it with an Apple-like dock if I wanted to (but I don't).
For anyone interested in GNU/Linux, further information can be found at the following links...