Who created Linux? A history of Linux development, and what draws users to it

In popular usage, "Linux" often refers to a group of operating system distributions built on top of the Linux kernel. In the strictest sense, though, Linux refers only to the kernel itself. To form a complete operating system, distributions often include a set of tools and libraries from the GNU Project and other sources. More developers have recently been using Linux to build and run mobile apps; it has also played a key role in the development of affordable devices such as Chromebooks, which run an operating system on top of the Linux kernel. In cloud computing and server environments in general, Linux is a popular choice for several practical reasons:
  • Its distributions remain current and are supported by communities of developers.
  • It can operate on a wide range of hardware and can be installed alongside existing systems (a useful feature in local development environments).
  • It supports the centralized installation of software from pre-existing repositories.
  • Its resource requirements are low.
  • It is often top of mind when developers build application ecosystems and server tooling, resulting in a high level of interoperability.
  • It allows users to modify the behavior of the operating system as needed.

Linux also traces its origins to the open source movement, and as a result, some developers choose it for a combination of ethical and practical considerations:

  • For some developers, using Linux represents a commitment to accessibility and freedom of expression.
  • The Linux community is also attractive for some developers: when they have questions, they can refer to the resources provided by this community or go directly to one of the many active maintainers.

In order to understand the role of Linux within the developer community (and beyond), this article will outline a brief history of Linux by way of Unix, and discuss some of the popular Linux distributions.

Roots in Unix

Linux has its roots in Unix and Multics, two projects with the common goal of creating a robust multi-user operating system.

Beginnings of Unix

Unix grew out of the Multics project at the Computer Science Research Center of Bell Laboratories. Developers working on Multics at Bell Labs and elsewhere were interested in creating a multi-user operating system with single-level storage, dynamic linking (in which a running process can request that another segment be added to its address space, allowing it to execute that segment's code), and a hierarchical file system.

Bell Labs stopped funding the Multics project in 1969, but a group of researchers, including Ken Thompson and Dennis Ritchie, continued to work with the project's core principles. In 1972-73 they made the decision to rewrite the system in C, which made Unix uniquely portable: unlike other contemporary operating systems, it could both move across and outlive its hardware.

Research and development continued at Bell Labs (later AT&T) and its successor Unix System Laboratories, which, in collaboration with Sun Microsystems, developed a version of Unix that would be widely adopted by commercial Unix vendors. At the same time, research continued in academia, most notably at the Computer Systems Research Group of the University of California, Berkeley. This group produced the Berkeley Software Distribution (BSD), which inspired a number of operating systems, many of which are still in use today. Two operating systems worth noting for historical reference are NeXTStep, the system launched by NeXT, which became the basis for macOS, among other products, and MINIX, the educational operating system that Linus Torvalds used as a starting point when developing Linux.

Basic Unix Features

Unix is oriented around the principles of clarity, portability, and concurrency.

  • Clarity: Unix's modular design allows functions to run in a limited, well-defined way. Its file system is unified and hierarchical, which simplifies data handling. Unlike some of its predecessors, Unix implements hundreds (not thousands) of system calls, each with a direct and clear purpose.
  • Portability: By writing Unix in C, the group at Bell Labs positioned Unix for widespread use and adoption. C was designed with low-level memory access, minimal runtime support, and an efficient mapping between language constructs and machine instructions. Its C foundation makes Unix flexible and comparatively easy to run on a variety of hardware.
  • Concurrency: The Unix kernel was tailored to the goal (shared with the Multics project) of supporting multiple users and workflows. Kernel space remains distinct from user space, allowing multiple applications to run at the same time.
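
This concurrency is easy to observe from user space. As a minimal sketch (shown here in Python for brevity; `os.fork` and `os.waitpid` are thin wrappers over the classic Unix system calls, so this assumes a Unix-like host):

```python
import os

# fork() asks the kernel to duplicate the calling process: both parent
# and child continue from this point, distinguished by the return value.
pid = os.fork()

if pid == 0:
    # Child process: do some independent work, then report a status code.
    os._exit(7)
else:
    # Parent process: block in wait() until the child terminates.
    _, status = os.waitpid(pid, 0)
    print("child exited with", os.WEXITSTATUS(status))
```

Both processes run under the kernel's supervision; neither can touch kernel space directly, which is what keeps concurrent applications from corrupting one another.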

Linux evolution

Unix introduced important paradigms for developers, but it remained proprietary in its early iterations. The next chapter of its story is the story of how developers worked within it, and against it, to create free and open-source alternatives.

Open source experiments

Richard Stallman was a central figure among the developers who were inspired to create non-proprietary alternatives to Unix. While working at MIT's Artificial Intelligence Laboratory, he began work on the GNU Project (a recursive acronym for "GNU's Not Unix!"), eventually leaving the lab in 1984 so that he could distribute GNU components as free software. The GNU kernel, known as GNU Hurd, became a focus of the Free Software Foundation (FSF), which Stallman founded in 1985.

Meanwhile, another developer was at work on a free alternative to Unix: the Finnish student Linus Torvalds. Frustrated with the MINIX license, Torvalds announced to a MINIX user group on August 25, 1991, that he was developing his own operating system resembling MINIX. Though originally developed on MINIX using the GNU C compiler, the Linux kernel quickly became a project in its own right, and its developers, led by Torvalds, released version 1.0 of the kernel in 1994.

Torvalds had used GNU code, including the GNU C Compiler, with his kernel, and it remains true that many Linux distributions rely on GNU components. Stallman has lobbied to expand the term "Linux" to "GNU/Linux", which he argues would capture both the GNU Project's role in the development of the Linux system and the underlying ideals that the GNU Project and the Linux kernel share. Today "Linux" is often used to refer to systems containing both the Linux kernel and GNU elements. At the same time, embedded systems on many handheld devices and smartphones often use the Linux kernel with few or no GNU components.

Key features of Linux

While the Linux kernel inherited many goals and properties from Unix, it differs from that earlier system in the following ways:

  • Its core component is the kernel, which is developed independently of other operating system components. Linux borrows elements from a variety of sources (such as GNU) to make up an entire operating system.
  • It is free and open source. Supported by a developer community, the kernel is licensed under the GNU General Public License (an outgrowth of the FSF's work on the GNU Project) and is available for download and modification. The GPL stipulates that derivative works must maintain the license terms of the original software.
  • It has a monolithic kernel, like Unix, but it can dynamically load and unload kernel code (modules) on demand.
  • It supports symmetric multiprocessing (SMP), unlike traditional Unix implementations. This means a single operating system can use multiple processors that share main memory and access to all I/O devices.
  • Its kernel is preemptive, another difference from Unix. This means the scheduler can force a context switch while a driver or another part of the kernel is executing.
  • Its kernel does not differentiate between threads and ordinary processes.
  • It includes a command-line interface (CLI) and may also include a graphical user interface (GUI).
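
The last of the kernel-level points can be observed from user space. A small Python sketch (the thread-ID behavior described in the comments assumes a Linux host):

```python
import os
import threading

# On Linux the kernel schedules "tasks": a process is a task, and a
# thread is a task that shares memory with its siblings. The process
# ID is simply the thread ID of the initial task.
print("PID:", os.getpid())
print("main thread TID:", threading.get_native_id())

worker_tids = []

def worker():
    # Each new thread receives its own kernel task ID, drawn from the
    # same number space that process IDs come from.
    worker_tids.append(threading.get_native_id())

t = threading.Thread(target=worker)
t.start()
t.join()
print("worker TID:", worker_tids[0])

# SMP support: the kernel may run these tasks on any of the
# processors it has brought online.
print("online CPUs:", os.cpu_count())
```

On a Linux machine, the PID and the main thread's TID print as the same number, illustrating that threads and processes are the same kind of schedulable entity to the kernel.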

Popular Linux distributions

Developers today maintain many popular Linux distributions. Among the oldest is Debian, a free and open-source distribution that offers some 50,000 software packages. Debian inspired another popular distribution, Ubuntu, which is funded by Canonical Ltd. Ubuntu uses the deb package format and Debian's package-management tools.

A similar relationship exists between Red Hat, Fedora, and CentOS. Red Hat created a Linux distribution in 1993, and a decade later split its efforts into Red Hat Enterprise Linux and Fedora, a community-based operating system that uses the Linux kernel and elements of the GNU Project. Red Hat is also involved in the CentOS project, another popular Linux distribution for web servers. This relationship, however, does not include paid support: like Debian, CentOS is maintained by its developer community.

Conclusion

In this article, we've looked at the roots of Linux in Unix and some of the characteristics it inherited. Feel free to leave comments below.

In 1991, as today, computers were classified according to their size and capabilities, in categories ranging from desktop personal computers (PCs) to supercomputers. The x86-based computers, the direct ancestors of today's PCs, dominated the personal computer market in 1991. However, other types of computers were available at the time, including Macs. Such computers typically used other processors and ran their own operating systems.

History of Linux

In 1991, most computers ran Microsoft's Disk Operating System (MS-DOS, PC-DOS, or DOS). By today's standards, DOS was extremely limited. This single-tasking OS (capable of running only one application at a time) could not even take full advantage of the available memory or processor resources. The versions of Microsoft Windows available in 1991 ran on top of DOS. While the early versions of Windows worked around some of the limitations of DOS, they did not solve these problems completely. For example, early versions of Windows used cooperative multitasking: programs had to voluntarily yield CPU time so that other processes could run. The DOS kernel could not wrest control from a program that was consuming processor time.

Unix was the preeminent multi-user operating system of 1991. Compared to DOS and the versions of Windows of the time, Unix was a fairly complex system. Unix supported multiple user accounts and provided true preemptive multitasking, in which the kernel could control the processor resources allocated to programs even if the programs did not voluntarily relinquish control. These features were practical necessities for many servers and multi-user computers such as minicomputers and mainframes.

Unix wasn't the only multi-user, multitasking OS in 1991; DEC's VMS (Virtual Memory System), for example, was also available. Nevertheless, Unix is the system most directly related to the history of Linux.

Over time, the capabilities of each class of computers have increased. By most measures, today's personal computers are about as powerful as the minicomputers or even mainframes of 1991. The operating systems used on PCs in 1991, however, did not scale well to more powerful hardware: greater processing power alone could not remove the limitations inherent in DOS.

For this reason, DOS and its contemporaries, designed for smaller computers, were eventually displaced by Unix and other alternatives.

Modern versions of Windows are not derived from DOS. Instead, they use a newer kernel that shares much of its design with VMS.

In 1991, Linus Torvalds was studying computer science at the University of Helsinki. He was interested in Unix and in the capabilities of the new x86 computer he had just bought. Torvalds began developing the program that would become the Linux kernel as a low-level terminal emulator for connecting to larger university computers. As his program evolved, he added features that turned his terminal emulator into something more like an OS kernel. Eventually, he set himself the goal of creating a Unix-compatible kernel, that is, a kernel that could run the wide variety of Unix programs available at the time.

Linus Torvalds

The history of Unix began two decades earlier, in 1969 at AT&T. Since AT&T held the telephone monopoly in the United States at the time, it was not allowed to sell software. Thus, having created Unix, the people at AT&T effectively gave it away. Universities and researchers enthusiastically embraced Unix, and some even began modifying the system, because AT&T made the source code available. Thus the history of Unix includes a 20-year period of open source software development. Most Unix programs were distributed as source code because Unix ran on a wide variety of hardware platforms: binary programs built for one machine could rarely run on another.

Early on, Linux began to exploit the potential of this existing software. Developers of the early versions of Linux were especially interested in the software of the GNU Project, so the operating system quickly acquired a collection of its utilities. Most of these programs had been created with workstations and more powerful computers in mind, but thanks to the continued improvement of computer hardware, they worked well on the x86 computers of the early 1990s.

In the early 1990s, 386BSD was a competing Unix-like operating system. It has since split into several related operating systems: FreeBSD, NetBSD, OpenBSD, DragonFly BSD, and PC-BSD.

Linux quickly gained dedicated developers who appreciated its potential to bring workstation-class software to the PC. These people worked to improve the Linux kernel, to make the changes necessary for existing Unix programs to run on Linux, and to create Linux-specific support programs. By the mid-1990s, several Linux distributions existed, including some still in use today. (For example, the Slackware distribution was released in 1993, and Red Hat in 1995.)

Linux microkernel controversy

Linux is an example of a monolithic kernel, that is, a kernel that does everything required of it within one large process. In 1991, a competing kernel design known as the microkernel was coming into vogue. Microkernels are much smaller than monolithic kernels: they offload as many tasks as possible to processes outside the kernel and then manage the communication between those processes.

Shortly after the release of Linux, Linus Torvalds took part in a public debate with Andrew Tanenbaum, creator of the Minix OS, which Torvalds had used as a platform during the early development of Linux. Minix used a microkernel design, and Tanenbaum considered the monolithic design of Linux obsolete.

From a practical point of view, either design works for the end user. Linux and the BSD-derived kernels use a monolithic design, while modern versions of Windows, GNU Hurd, and Minix are examples of microkernels. Nevertheless, some users are still ready to argue themselves hoarse over this difference.
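
The difference between the two designs can be caricatured in a few lines of Python (a conceptual toy, not real kernel code; the "file service" and its message queues are invented here purely for illustration):

```python
import queue

# Monolithic style: the "file service" is a function inside one big
# kernel program, so a request is an ordinary, cheap function call.
def monolithic_read(name):
    return f"contents of {name}"

# Microkernel style: the file service lives in a separate server; the
# kernel's job shrinks to passing messages between client and server.
requests, replies = queue.Queue(), queue.Queue()

def file_server():
    name = requests.get()               # receive a request message
    replies.put(f"contents of {name}")  # send the reply message

def microkernel_read(name):
    requests.put(name)    # message out to the server...
    file_server()         # (run inline here; in reality a separate process)
    return replies.get()  # ...and the reply back

print(monolithic_read("motd"))
print(microkernel_read("motd"))
```

Both calls produce the same result, which is the practical point: the designs differ in where the work happens and how much message-passing overhead it incurs, not in what the end user sees.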

Linux timeline

Translation from Wikipedia.

  • 1991: The Linux kernel is publicly announced on 25 August by the 21-year-old Finnish student Linus Benedict Torvalds.
  • 1992: The Linux kernel is licensed under the GNU GPL. The first Linux distributions are created.
  • 1993: Over 100 developers are working on the Linux kernel. With their help, the kernel is adapted to the GNU environment, which opens up a wide range of applications for Linux. Slackware, the oldest Linux distribution still maintained (as of 2018), is released for the first time. Later that year, the Debian Project is created; today it is the largest community distribution.
  • 1994: Torvalds judges all components of the kernel to be fully matured and releases version 1.0 of Linux. The XFree86 project provides a graphical user interface (GUI). The commercial Linux distribution makers Red Hat and SUSE publish version 1.0 of their Linux distributions.
  • 1995: Linux is ported to the DEC Alpha and Sun SPARC platforms. Over the following years it is ported to an ever-growing number of platforms.
  • 1996: Version 2.0 of the Linux kernel is released. The kernel can now serve multiple processors at the same time using symmetric multiprocessing (SMP), and thus becomes a serious alternative for many companies.
  • 1998: Many major companies such as IBM, Compaq and Oracle announce their support for Linux. The Cathedral and the Bazaar is first published as an essay (later as a book), resulting in Netscape publicly releasing the source code for their Netscape Communicator suite of web browsers. Netscape's actions and the essay's acclaim bring the Linux open source development model to the attention of the popular tech press. In addition, a group of programmers is beginning to develop the KDE GUI.
  • 1999: A development team begins work on the GNOME desktop environment, intended to be a free replacement for KDE, which at the time depended on the then-proprietary Qt toolkit. During the year, IBM announces a massive project to support Linux.
  • 2000: Dell announces that it is currently the No. 2 vendor of Linux-based systems in the world and the first major manufacturer to offer Linux across its entire product line.
  • 2002: Media reports that "Microsoft killed Dell Linux".
  • 2004: The XFree86 team splits and merges with the existing X standards body to form the X.Org Foundation, resulting in significantly faster development of an X server for Linux.
  • 2005: The openSUSE project begins as a free, community distribution from Novell. The OpenOffice.org project also releases version 2.0, which supports the OASIS OpenDocument standard.
  • 2006: Oracle releases its own distribution of Red Hat Enterprise Linux. Novell and Microsoft announce a partnership for improved interoperability and mutual patent protection.
  • 2007: Dell starts distributing laptops with Ubuntu preinstalled.
  • 2009: Red Hat's market capitalization equals Sun's, which is interpreted as a symbolic moment for the "Linux-based economy".
  • 2011: Version 3.0 of the Linux kernel is released.
  • 2012: Total revenue from the Linux server market exceeds revenue from the rest of the Unix market.
  • 2013: Google's Linux-based Android claims 75% of the smartphone market share, in terms of the number of phones shipped.
  • 2014: Ubuntu claims 22,000,000 users.
  • 2015: Version 4.0 of the Linux kernel is released.

The Linux world today

By the mid-1990s, the most important features of today's version of Linux had been created. Changes that have taken place since then include the following.

  • Kernel improvements. Since 1991, the Linux kernel has undergone significant changes, adding many of the features we use today. Improvements include the addition of networking features, countless device drivers, support for power-management features, and support for many non-x86 processors.
  • Improved support tools. In addition to the Linux kernel, improvements have been made to the support programs it relies on—compilers, shells, GUIs, and so on.
  • Creation of new support tools. New support tools have emerged over the years. They range from simple small utilities to large desktop environments. In fact, some of these tools, such as modern desktop environments, are much more obvious to the end user than the kernel itself.
  • Creation of new distributions. As noted, the Slackware distribution was created in 1993, and Red Hat (the forerunner of the Red Hat Enterprise Linux, CentOS, and Fedora distributions) was released in 1995. Other distributions appeared in the following years, some of them important. For example, the Android system used in smartphones and tablets has become widespread over the past decade.

Linux is largely a product of the open source software development that began in the 1980s and 1990s. While the typical user of a desktop or embedded OS is likely to view the operating system through a GUI lens, much of what goes on under the surface is driven by the Linux kernel and open source tools, many of which have been around for decades.

The roots of Linux can be traced back to the 1970s. The starting point can be considered the appearance of the Unix operating system in 1969 in the United States, at Bell Laboratories, a subsidiary of AT&T. Unix became the basis for a large number of industrial-grade operating systems. The most important of them are shown in this timeline:

Linux owes its life most to two projects, GNU and Minix.

GNU

The history of the GNU project began in September 1983. The founder of the GNU project, Richard M. Stallman, was working at the time in the artificial intelligence laboratory of the Massachusetts Institute of Technology (MIT, Cambridge, Massachusetts). Stallman is called one of the most outstanding programmers of our time.

In the environment to which Stallman belonged, it was customary to freely exchange programs and their source codes. A Unix license from AT&T, for example, cost $40,000. Only fairly large firms could afford to buy it. And without a license, the programmer had no right to use the source codes of the system in his developments. This prevented the exchange of ideas in the field of programming and greatly slowed down the process of creating programs, since instead of borrowing a ready-made piece of code to solve a particular problem, the program developer was forced to write this piece of code again, which is akin to reinventing the wheel.

Stallman set out to change this state of affairs in programming. In 1983, he announced the start of the GNU project, the goal of which was to create a completely open operating system:

Thursday, September 27, 1983 12:35:59 PM EST

Free Unix!

Starting this Thanksgiving I am going to write a complete Unix-compatible software system called GNU (for Gnu's Not Unix), and give it away free(!) to everyone who can use it. Contributions of time, money, programs and equipment are greatly needed.

GNU will contain the kernel plus all the utilities needed to write and run C programs: an editor, a shell, a C compiler, a linker, an assembler, and a few more things. After that, a text formatter, YACC, an Empire game, a spreadsheet, and hundreds of other things will be added. We hope to include everything that typically comes with Unix systems and anything else you might find useful, including online and printed documentation.

GNU will be able to run Unix programs, but will not be identical to Unix. We will make improvements to the system based on our experience with other operating systems...


The name GNU is a recursive acronym for "GNU's Not Unix". Unix has always been non-free software, meaning that it denies its users the freedom to cooperate as well as control over their own computers (much as Windows does today). A little later, Stallman wrote his famous GNU Manifesto, which became the basis for the GPL (GNU General Public License). The role of this license can hardly be overstated: it changed the entire computer industry.

The main idea of the GPL is that the user must have the following four rights (or four freedoms):

  • The right to run the program for any purpose (freedom 0);
  • The right to study the structure of the program and adapt it to your needs (freedom 1), which involves access to the source code of the program;
  • The right to distribute the program while being able to help others (freedom 2);
  • The right to improve the program and publish improvements, for the benefit of the entire community (freedom 3), which also involves access to the source code of the program.
Software distributed under this license may be used, copied, and modified, and modified (or even unmodified) versions may be transferred or sold to others in any way, provided that the result of such reworking is also distributed under the GPL. This last condition is the most important and defining one in the license. It ensures that the results of free software developers' efforts remain open and do not become part of any conventionally licensed product. It also distinguishes free (as in freedom) software from software that is merely free of charge. One of the requirements of the license is that if you sell software under the GPL, you must make its source code available to anyone who wants access to it. The GPL "makes software free and guarantees that it stays free."

By 1990, the GNU Project had created most of the components needed for a free operating system. In addition to the Emacs text editor, Stallman had created the gcc compiler (GNU C Compiler) and the gdb debugger. An outstanding programmer, Richard Stallman single-handedly managed to create an efficient and reliable compiler that outperformed products built by entire teams of programmers at commercial vendors. Since it was originally designed with portability in mind, versions of this compiler exist today for almost all operating systems. Compilers for other programming languages followed, including C++, Pascal, and Fortran, which is why the abbreviation GCC now stands for GNU Compiler Collection.

As Richard Stallman writes: "By 1990, the GNU system was almost complete; the only major missing component was the kernel." The kernel (called the Hurd) was to be implemented as a set of server processes running on Mach, a microkernel created at Carnegie Mellon University and later developed at the University of Utah. Development was delayed while waiting for Mach, which had been promised as free software. While its release dragged on, a kernel developed by the Finnish student Linus Torvalds appeared: Linux. Linus created it in an attempt to improve on Minix, his home operating system, which is worth discussing separately.

Minix

During the 1990s, personal computers based on Intel microprocessors and equipped with Microsoft operating systems dominated the desktop market and also captured a significant share of the server market, a traditional stronghold of Unix systems. Computers based on Intel and Intel-compatible processors had achieved processing power comparable to that of Unix workstations. But most commercial Unix systems had no versions that could run on Intel hardware. Unix vendors usually worked closely with the manufacturers of particular processors, or even held ownership stakes in the companies that made those processors, and were therefore interested in using their own designs. Examples include the SGI and MIPS processor lines.

Since the hardware capabilities of personal computers grew rapidly, it was natural that sooner or later Unix variants for computers based on Intel-compatible processors would appear. One such Unix-like operating system, which played a special role in the history of Linux, was developed in January 1987 by Andrew S. Tanenbaum, a professor at the Vrije Universiteit in Amsterdam, the Netherlands. Tanenbaum was one of the leading experts on operating system design. He developed his Minix operating system as a teaching aid, using it to show students the internal structure of a real operating system.

Of course, as an operating system, Minix was not the height of perfection. It targeted the Intel 80286 microprocessor, which dominated the market at the time. But it had one very important quality: open source code. Anyone who had Tanenbaum's book "Operating Systems" could study and analyze the 12,000 lines of code written in C and assembly language. It was a rare case in which source code was not locked away in the developer's safe. An excellent author, Tanenbaum managed to engage the most prominent minds in computer science in discussing the art of creating operating systems. Minix could also be purchased separately from the book and actually installed on a personal computer. Computer science students around the world pored over Tanenbaum's book, reading the code to understand how the very system that controlled their computers worked. And one of those students was Linus Torvalds.

Linux

In 1991, Linus Torvalds, a Finnish student, became deeply fascinated with the idea of writing a Unix-compatible operating system kernel for his personal computer with an Intel processor. The prototype for the future kernel was the Minix operating system: a Unix-compatible operating system for personal computers that booted from floppy disks and fit into the then very limited memory of a personal computer.

On August 25, 1991, Linus Torvalds sent the first post about his development to the comp.os.minix newsgroup:

From: torvalds@klaava.Helsinki.FI (Linus Benedict Torvalds)
Newsgroups: comp.os.minix
Subject: What would you like to see most in minix?
Summary: small poll for my new operating system
Message-ID: <[email protected]>
Date: 25 Aug 91 20:57:08 GMT
Organization: University of Helsinki

Hello everybody out there using minix -

I'm doing a (free) operating system (just a hobby, won't be big and professional like gnu) for 386(486) AT clones. This has been brewing since April, and is starting to get ready. I'd like any feedback on things people like/dislike in minix, as my OS resembles it somewhat (same physical layout of the file system (due to practical reasons) among other things).

I've currently ported bash (1.08) and gcc (1.40), and things seem to work. This implies that I'll get something practical within a few months, and I'd like to know what features most people would want. Any suggestions are welcome, but I won't promise I'll implement them :-)

Linus ( [email protected])

PS. Yes - it's free of any minix code, and it has a multi-threaded fs. It is NOT portable (uses 386 task switching etc.), and it probably never will support anything other than AT hard disks, as that's all I have :-(


The new system got the name "Linux" in the following way. Torvalds himself was somewhat embarrassed by the similarity of the name to his own, so he tried to call his creation Freax. That name can be found in the Makefile of kernel version 0.11 and in the source code of other programs. But Ari Lemke, who provided the space to host the system on an FTP site, named the directory pub/OS/Linux, and that name stuck to the new OS.

The fact that Linus posted the code of his OS on the Internet was decisive for the future of Linux. Although the Internet in 1991 was not yet as widespread as it is today, it was used mainly by people with solid technical training. And from the very beginning, Torvalds received interested responses.

Around February 1992, Linus asked everyone who was already using or testing Linux to send him a postcard. Several hundred postcards arrived from all over the world: from New Zealand, Japan, the Netherlands, and the USA. This showed that Linux was beginning to gain real recognition.

Initially hundreds, then thousands, then hundreds of thousands of volunteers joined the development. The system was no longer just a hackers' toy. Complemented by a host of programs developed under the GNU Project, Linux became suitable for practical use. And the fact that the kernel was distributed under the GNU General Public License guaranteed that its source code would remain free, that is, it could be copied, studied, and modified without fear of legal repercussions from the developer or some commercial firm. This fact attracted more and more followers to the ranks of Linux users and supporters, primarily students and programmers.

By this time a separate Usenet newsgroup dedicated to Linux had formed: comp.os.linux. Enthusiasts organized many user groups, and in early 1994 the first issue of the Linux Journal was published. Linux caught the attention of industry, and several small companies began to develop and sell their own versions of it.

Initially, Linus Torvalds did not want to sell his creation, nor did he want anyone else to sell it. This was clearly stated in the copyright notice placed in the COPYING file of the very first version, 0.01. Moreover, Linus's terms imposed much stricter restrictions on the distribution of Linux than those of the GNU license: it was forbidden to charge any money for transferring or using Linux. But as early as February 1992, he was approached for permission to charge a small fee for distributing Linux floppy disks, to cover the time and cost of the disks. He also had to reckon with the fact that Linux had been built with many tools freely distributed on the Internet, the most important of which was the GCC compiler, which is copyrighted under the GPL that Richard Stallman devised. Torvalds revised his copyright statement, and as of version 0.12 he, too, switched to the GPL.

From a technical point of view, Linux is only the kernel of a Unix-like operating system, responsible for interacting with the computer hardware and performing tasks such as allocating memory, dividing processor time among programs, and so on. Besides the kernel, an operating system includes many different utilities that organize the user's interaction with the system. The success of Linux as an operating system is largely due to the fact that by 1991 the GNU project had already developed many utilities freely distributed on the Internet. The GNU Project lacked a kernel, and Linus's kernel would most likely have gone unclaimed had the necessary utilities been missing: Torvalds and his development were in the right place at the right time. And Richard Stallman has a point when he insists that the operating system should be called not Linux but GNU/Linux. But the name Linux has stuck to this OS historically, so we too will call it simply Linux (without forgetting the merits of Stallman and his associates).

P.S. I honestly flipped through all 36 pages of search results on Habré for the query "linux history" and found nothing coherent on the topic, which seemed strange to me given the popularity of the system among Habré readers. I collected this information bit by bit from all over the Internet, separated the wheat from the chaff, and hope it will be of interest to you.

UPD: Readers rightly pointed out problems with the timeline. I have reworked it and rechecked all the dates; I think it has become better and clearer.

Because system calls looked more or less the same across all implementations of UNIX, GNU programs could run (with little or no modification) on any UNIX-like operating system.

With the existing GNU tools it was already possible to write C programs using only free software, but there was no free UNIX-compatible kernel on which those tools could run. The GNU developers were therefore forced to use one of the proprietary implementations of UNIX, that is, to follow the architectural decisions and technologies adopted in those operating systems and base their own work on them. Stallman's dream of scientific software development free from commercially driven decisions could not be realized as long as free software development rested on a proprietary UNIX-compatible kernel whose source code remained closed to developers.

The Linux kernel

UNIX compatibility at this point meant that the operating system had to support the POSIX standard. POSIX is a functional model of a UNIX-compatible operating system: it describes how the system should behave in a given situation, but gives no guidance on how that behavior should be implemented programmatically. POSIX captured those properties of UNIX-compatible systems that were common to the different implementations of UNIX at the time the standard was created. In particular, POSIX describes the system calls that a compliant operating system must handle.
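To make the idea of standardized system calls concrete, here is a minimal sketch of the open/write/read/close sequence POSIX specifies. It is written in Python, whose `os` module exposes the POSIX calls almost one-to-one; the file name is invented for the example, and this is an illustration of the standard's spirit, not a definitive systems program.

```python
import os
import tempfile

def posix_roundtrip(message: bytes) -> bytes:
    """Write a message using POSIX-style calls, then read it back."""
    path = os.path.join(tempfile.gettempdir(), "posix_demo.txt")

    # open(2) with O_WRONLY|O_CREAT|O_TRUNC and mode 0644, as POSIX describes
    fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o644)
    os.write(fd, message)          # write(2)
    os.close(fd)                   # close(2)

    fd = os.open(path, os.O_RDONLY)
    data = os.read(fd, 1024)       # read(2)
    os.close(fd)
    os.remove(path)
    return data

print(posix_roundtrip(b"hello, POSIX\n").decode())
```

Because only behavior is standardized, the same sequence of calls works unchanged on any POSIX-compliant system, whatever the kernel underneath.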

The global computer networks Usenet and the Internet played a crucial role in the development of Linux. At the very early stages, Linus Torvalds discussed his work and problems with other developers in the Usenet newsgroup comp.os.minix, dedicated to the MINIX operating system. Linus's key decision was to publish the source code of the still barely working first version of the kernel under the free GNU GPL license. Thanks to this, and to the increasingly widespread Internet, many people were able to compile and test the kernel themselves, take part in discussing and fixing errors, and send Linus corrections and additions to the source code. With more than one person working on the kernel, development went faster and more efficiently.

In 1992 the Linux kernel reached version 0.95, and in 1994 version 1.0 was released, signaling that the developers finally considered the kernel essentially finished and all bugs (theoretically) fixed. Development of the Linux kernel is now the collective effort of a much larger community than before version 1.0. The role of Linus Torvalds himself has changed too: he is no longer the main developer but the most authoritative member of the community, who traditionally judges the quality of source code proposed for the kernel and gives his approval for its inclusion. The general model of free development by the community, however, remains.

GNU and Linux

However, just as an operating system is impossible without a kernel, a kernel is useless without utilities that exploit its capabilities. Thanks to the GNU project, Linus Torvalds was immediately able to use free utilities with Linux: bash, the gcc compiler, tar, gzip, and many other well-known, widely used applications that could work with his UNIX-compatible kernel. Linux thus landed in a good environment from the start and, in combination with the GNU utilities, was a very attractive platform for software developers even at a very early stage of its development.

The fundamental step forward was precisely that the Linux kernel plus the GNU utilities and applications made it possible, for the first time, to build a completely free operating system, that is, to work with a computer, and even to develop new software, using only free software. The ideal of completely non-commercial development formulated by Stallman could now be brought to life.

The theoretical possibility of realizing the ideal had appeared, but that did not mean its immediate practical implementation. Linux and the GNU utilities were compatible because both were written with the same standards and practices in mind. But within that practice (that is, amid many different UNIX systems) there was plenty of room for incompatibilities and divergent decisions, so at the initial stage of kernel development every GNU application that ran on Linux was another achievement for Linus. The first were bash and gcc. Thus the combination of GNU and Linux made it possible to create a free operating system, but did not in itself yet constitute such a system, because Linux and the various GNU utilities remained separate software products written by different people who did not always take each other's work into account. The main property of any system is the consistency of its components.

The emergence of distributions

After a certain period of development, a number of the most important GNU utilities were already running stably on Linux. The compiled Linux kernel, together with a small set of GNU utilities compiled on Linux itself, made up a toolkit for a software developer who wanted to use a free operating system on a personal computer. In this form Linux was not only suitable for developing Linux itself; it was an operating system in which some applied tasks could already be performed. The first thing you could do in Linux, of course, was write programs in C.

When getting a computer with a permanently installed Linux system became a common and sought-after task, developers at the universities of Helsinki and Texas created their own sets of floppy disks from which the compiled kernel and basic utilities could be written to a hard disk, after which the operating system could boot directly from it. These floppy-disk sets were the first prototypes of modern Linux distributions: software packages that can be used to get a working operating system onto a computer. It should be noted that Linux distributions included GNU software products from the very beginning; in fact, whenever we say "the Linux operating system" we mean "the Linux kernel and the GNU utilities". The Free Software Foundation recommends calling this operating system GNU/Linux.

However, copying all the necessary programs to the hard drive is not enough to obtain an operating environment that suits a user's needs (even a very professional user's). A working system requires special means of installing and configuring the software, so the first floppy-disk sets can only loosely be called distributions. It is the presence of such tools that distinguishes modern Linux distributions. A distribution's other most important task is regular updating: software, especially free software, is one of the fastest-moving fields, so it is not enough to install Linux once; it must be updated regularly. The first distribution in the modern sense to be widely adopted was Slackware, created by Patrick Volkerding; it was already widely known among Linux users in 1994.

Although with the advent of the first distributions installing Linux no longer required compiling all the programs from source, using Linux remained the province of developers: at that stage of its development, a user of the system could do almost nothing but program. At the very least, to perform other everyday applied tasks (reading e-mail, writing articles, and so on), the user first had to spend time programming, and even developing the Linux system itself, in order to create the appropriate applications or make them work on Linux.

All Linux software was open source, so Linux applications soon grew more numerous, served a larger community, became more reliable, and gained more and more functionality. Eventually the idea emerged that, with the focused efforts of a small group of developers, Linux and GNU applications could be assembled into complete operating systems suitable for a very wide range of users, and that these systems could be sold as an analogue of, and alternative to, existing proprietary operating systems.

The benefit of an operating system consisting entirely of free software is obvious: those who assemble the system do not have to pay anyone for the programs included in it. Moreover, further development and updating of existing programs is also done by the developer community for free, so there is no need to pay employees to do it. As a result, the costs of a company compiling a Linux distribution are limited to paying the programmers who integrate disparate applications into a single system and who write programs standardizing installation and configuration to ease these tasks for an unprepared user, plus the costs of publishing the resulting distribution. For the end buyer this means a fundamentally lower price for the operating system.

The first successful company operating under this scheme was Red Hat, which appeared in 1995. Red Hat addressed its products not only to professional programmers but also to ordinary users and system administrators, for whom a computer is primarily an office workstation or a work server. Looking to the existing offerings on the market for this class of users, Red Hat has always paid great attention to graphical applications for typical system setup and administration tasks. Red Hat's business developed quite successfully; in 1999 the company went public, and immediately after the IPO its shares rose in price very vigorously, though the hype later subsided. Red Hat currently holds a very large share of the Linux server and workstation market. Thanks to Red Hat, the RPM package format has become widespread in the Linux user community.

The Debian project was born at almost the same time as Red Hat. Its goal was much the same: to make a complete distribution from Linux and GNU free software. But this project was conceived as fundamentally non-commercial, run by a community of developers whose rules of interaction would fully accord with the ideals of free software. The Debian developer community is international, its members interact via the Internet, and the rules of interaction among them are set out in special documents called policies.

The developer community makes no profit from selling Debian: its releases are freely distributed, available on the Internet, and may be distributed on physical media (CD, DVD), but even then their price rarely exceeds the cost of the media plus a markup covering publication costs. Initially Debian development was sponsored by the Free Software Foundation. Debian's audience has always consisted primarily of professional users connected in one way or another with academic software development, who are ready to read documentation and assemble, with their own hands, a system profile matching their particular tasks. This orientation predetermined some trends in Debian's development: it never had an abundance of "simple" graphical configuration tools and "wizards", but much attention has always been paid to the means of consistent and uniform integration of software into a single system. It was in Debian that the APT package manager was born. Debian is currently the most popular Linux distribution among users who are IT professionals.

Wherever free software is in demand, many alternative solutions quickly appear, and so it happened with Linux distributions. Since 1995 a huge number of commercial companies and free communities have arisen (and continue to arise) whose task is to prepare and release Linux distributions, each with its own characteristics, target audience and priorities. Several leaders have since emerged on the distribution market, offering more or less universal solutions and enjoying the widest recognition and use. Besides the already mentioned Red Hat and Debian, the German SuSE and the French Mandriva (called Mandrake until 2005) should be named among distributions aimed at the average user, and Gentoo among those addressed to specialists. Beyond the "big" players there are many less common distributions, so a user who wants to install Linux now faces a question of choice. The selection criteria include the tasks to be solved with Linux, the user's level of training, the technologies involved, and future contact with the community that develops the distribution.

History of Linux in Russia

It so happened that in the international community that started and continued to develop Linux, everyone could express themselves in English to one degree or another. This is not surprising: historically, English became the language of computer science, of the UNIX operating system, of the global Internet, and of programming. In the international software development community English has played, and continues to play, a role comparable to that of Latin in the scientific community of medieval Europe. But if Linux is to be used not only for programming and communicating with programmers but also for solving everyday problems, localization is necessary: the ability to interact with a computer, and to use it, in languages other than English.

The goal of ASPLinux was to release a version of Red Hat modified to support the Russian language; the product bears the same name as the company.

All the listed Russian makers of Linux distributions exist to this day, continuing to release distributions more or less actively. However, they are losing popularity, because popular worldwide distributions such as Ubuntu and Fedora are now quite well translated into most of the world's languages.


When people say "Linux", they most often mean a group of operating systems built on the Linux kernel. In essence, though, Linux is only the kernel of an operating system; various other tools and libraries, from the GNU project and other sources, are used to build a full-fledged operating system. In addition, more and more developers use Linux to develop and run mobile applications; Linux also plays a key role in the development of devices such as Chromebooks, portable devices running Chrome OS, which is built on the Linux kernel combined with services developed by Google.

Linux has become popular for the following reasons:

  • relevance of distributions and active support by developer communities;
  • the ability to run on a variety of hardware;
  • low requirements for resources;
  • the ability to install programs from existing repositories.

But the list of reasons is, of course, not limited to these; there are ethical reasons as well as practical ones. For example, many developers see Linux as an expression of openness, self-expression and accessibility.

Development history

The roots of Linux go back to two other projects, Unix and Multics, both of which aimed to develop a multi-user operating system.

What is Unix?

Unix is a family of cross-platform, multi-user, multitasking operating systems.

Unix systems are among the most historically important operating systems. Unix's influence extended even to programming languages: the C language was created in the course of developing Unix systems.

Unix was developed at Bell Laboratories, which demonstrated the first Unix system in 1969. Unix systems steadily gained popularity: in the 1970s they began to be installed on computers at educational institutions.

When creating Unix, the developers set themselves three main tasks:

  1. Simplicity: using the minimum number of features.
  2. Commonality: the same methods and mechanisms are used in different cases.
  3. Combining programs to solve problems rather than developing new programs from scratch.

As for the distinguishing features of Unix, these are:

  1. Almost constant use of the command line.
  2. Use of pipelines.
  3. System setup through the use of simple (often text) files.
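The third point, configuration through simple text files, can be illustrated with a short sketch that parses an /etc-style key=value file. This is a minimal illustration in Python; the key names and the format details are invented for the example, not taken from any real Unix config.

```python
def parse_config(text: str) -> dict:
    """Parse a minimal key=value config, ignoring blank lines and # comments."""
    config = {}
    for raw_line in text.splitlines():
        line = raw_line.strip()
        if not line or line.startswith("#"):
            continue  # blank line or comment
        key, _, value = line.partition("=")
        config[key.strip()] = value.strip()
    return config

sample = """\
# hypothetical settings, in the spirit of files under /etc
hostname = devbox
timezone = UTC
"""
print(parse_config(sample))  # → {'hostname': 'devbox', 'timezone': 'UTC'}
```

The appeal of this convention is that any text editor, and any of the standard text-processing utilities, can read and modify the system's configuration.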

Unix has its own philosophy. Douglas McIlroy, the programmer who developed the Unix pipeline, formulated the following rules:

Write programs that do one thing and do it well.

Write programs that work together.

Write programs that support text streams because it's a generic interface.
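These three rules are what make shell pipelines such as `sort | uniq -c` work: each program is a small filter over a text stream, and larger tasks are solved by composition. Here is a minimal sketch of the same idea in Python; the filter functions are invented stand-ins for the real utilities, not their implementations.

```python
def sort_lines(lines):
    """Stand-in for sort(1): order the input lines."""
    return sorted(lines)

def uniq_count(lines):
    """Stand-in for uniq -c: collapse adjacent duplicates with a count."""
    counted = []
    for line in lines:
        if counted and counted[-1][1] == line:
            counted[-1] = (counted[-1][0] + 1, line)
        else:
            counted.append((1, line))
    return [f"{n} {line}" for n, line in counted]

# Composing the filters mirrors a pipeline like: ... | sort | uniq -c
words = ["linux", "gnu", "linux", "unix", "linux"]
print(uniq_count(sort_lines(words)))  # → ['1 gnu', '3 linux', '1 unix']
```

Note that `uniq_count` only collapses adjacent duplicates, which is exactly why the real `uniq` is almost always preceded by `sort` in a pipeline.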

One of the problems that plagued Unix was the existence of many versions, and of many programs that developers wrote to suit their own needs: because of poor compatibility, programs running on one version of Unix might not work on machines running other versions. As a result, it was decided to create a common document specifying the standards that developers should follow.

In 1983 the creation of GNU (GNU's Not UNIX), a Unix-like operating system, was announced. This happened under the influence of the conviction of the project's founder, Richard Stallman, that a freely distributed operating system, and open-source software in general, was needed.

Richard Stallman also founded the free software movement and articulated four freedoms a user should have: to run the program for any purpose, to study the program and modify it to suit their needs, to distribute the program to help others, and to publish improvements to help the community as a whole. Above all, this meant that a program's source code should be available to everyone.

It was this idea that inspired Linus Torvalds, the creator of Linux, to begin work on his operating system in 1991. Linux, like GNU, is a Unix-like system, that is, a system influenced by Unix.

It is this GNU/Linux system that would later come to be called simply Linux.

What is Multics?

Multics, or Multiplexed Information and Computing Service, is one of the very first operating systems to implement a flat data storage model and clearly separate out the concept of files (segments). The creation of Multics began in 1964. Developers from Bell Laboratories worked on the system; within a few years, some of them would begin work on the creation of Unix.

Multics was developed, first, to let a large number of users use a computer's resources simultaneously; second, to let users share data; and third, to ensure good speed when working with data.

However, the main computational goals had not been achieved when the first version of the system was released, and Bell Laboratories shifted its interest to another project, out of which Unix was born.

History of Linux

The history of Linux begins in 1991, when the Finnish programmer Linus Torvalds began developing an operating system kernel for his own computer. Publishing his work on a public server became the key event in the history of Linux. First tens, then hundreds and thousands of developers supported his project, and through their common efforts a full-fledged operating system was born.

As already mentioned, Linux was heavily influenced by the Unix system, as even the name shows. Initially, however, the project was called Freax, from the words "free" and "freak", but the name was later changed to a blend of the creator's name (Linus) and Unix.

The Linux logo is Tux, a penguin drawn in 1996 by the programmer and designer Larry Ewing. The idea of using a penguin, however, came from Linus Torvalds himself. Today Tux is a symbol not only of Linux but of free software in general.

The first official version, Linux 1.0, was released in 1994; version 2.0 followed in 1996. The Linux trademark had been registered a year earlier, in 1995.

From the beginning to this day, Linux has been distributed as free software under the GPL. This means any user can see the operating system's source code, and not only see it but also modify it. The only condition is that the modified code must likewise be available to everyone and distributed under the GPL license. This matters because it lets developers use the code without fearing copyright problems.

Linux owes much of its success to GNU: by the time Linux appeared, the project had already produced many free utilities that could be used with the new kernel.

In essence, Linux is still the kernel of a Unix-like operating system, performing various low-level tasks. The GNU project, meanwhile, needed a kernel, so Linus Torvalds's development came at just the right time.

Now, due to its flexibility, Linux is used on many different devices, ranging from computers to servers and mobile devices.

Popular Linux distributions

A Linux distribution is an operating system built on the Linux kernel that can be installed on a user's machine. Distributions usually contain not only the kernel and the base operating system but also useful applications: editors, players, database tools and other software.

That is, as already mentioned at the beginning of the article, a Linux distribution is an operating system consisting of the Linux kernel and utilities developed under GNU.

The number of existing Linux distributions exceeds 600, more than 300 of which are actively maintained and updated.