Chapter 4 Essential Unix/Linux Terminal Knowledge
Unix was developed at AT&T Bell Labs starting in 1969. Formally, “UNIX” is a trademarked operating system, but when most people talk about “Unix” they are talking about the shell, which is the text-command-driven interface by which Unix users interact with the computer.
The Unix shell has been around, largely unchanged, for many decades because it is awesome. When you learn it, you aren’t learning a fad, but, rather, a mode of interacting with your computer that has been time tested and will likely continue to be the lingua franca of large computer systems for many decades to come.
For bioinformatics, Unix is the tool of choice for a number of reasons:

1. complex analyses of data can be undertaken with a minimum of words;
2. Unix allows automation of tasks, especially ones that are repeated many times;
3. the standard set of Unix commands includes a number of tools for managing large files and for inspecting and manipulating text files;
4. multiple, successive analyses upon a single stream of data can be expressed and executed efficiently, typically without the need to write intermediate results to disk;
5. Unix was developed when computers were extremely limited in terms of memory and speed, so many Unix tools have been well optimized and are appropriate to the massive genomic data sets that can be taxing even for today’s large, high-performance computing systems;
6. virtually all state-of-the-art bioinformatic tools are tailored to run in a Unix environment; and finally,
7. essentially every high-performance computer cluster runs some variant of Unix, so if you are going to be using a cluster for your analyses (which is highly likely), then you have gotta know Unix!
4.1 Getting a bash shell on your system
A special part of the Unix operating system is the “shell.” This is the system that interprets commands from the user. At times it behaves like an interpreted programming language, and it also has a number of features that help to minimize the amount of typing the user must do to complete any particular task. There are a number of different “shells” that people use. We will focus on one called “bash,” which stands for the “Bourne again shell.” Many of the shells share a number of features.
Many common operating systems are built upon Unix or upon Linux—an open-source flavor of Unix that is, in many scenarios, indistinguishable. (Hereafter we will refer to both Unix and Linux as “Unix” systems.) For example, all Apple Macintosh computers are built on top of the Berkeley Software Distribution (BSD) of Unix, and bash has long been the default shell. Many people these days use laptops that run a flavor of Linux like Ubuntu, Debian, or RedHat. Linux users should ensure that they are running the bash shell. This can be done by typing “bash” at the command line, or inserting that into their profile. To know what shell is currently running, you can type:
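One quick way (a sketch; in bash, the special parameter `$0` holds the name of the current shell):

```sh
echo $0
```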
at the Unix command line. If you are running bash, the result should be `bash` or `-bash`.
PCs running Microsoft Windows are something of the exception in the computer world, in that they are not running an operating system built on Unix. However, Windows 10 and later can run a full Linux environment via the Windows Subsystem for Linux (WSL). For Windows, it is also possible to install a lightweight implementation of bash (like Git Bash). This is helpful for learning how to use Unix, but it should be noted that most bioinformatic tools are still difficult to install on Windows.
Mac computers used to come from the store configured to use bash as the default shell; however, since the Catalina OS, Apple now sets a different shell—the Z-shell, or zsh—as the default. This is apparently because of some changes to bash’s license status. However, an old version of bash is still on the Apple system, and it can be set as the default (which is what I do, because bash is what is used in most bioinformatics). Doing so involves changing the default shell with the `chsh` command (`chsh -s /bin/bash`) and then adding a line that looks like:
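A line that does the trick on recent macOS versions (an assumption about your setup: this environment variable silences the “default interactive shell is now zsh” warning that bash otherwise prints):

```sh
export BASH_SILENCE_DEPRECATION_WARNING=1
```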
to your `~/.bashrc` file. (This will all make more sense after you have digested the contents of this chapter.)
4.3 The anatomy of a Unix command
Nearly every Unix command that you might invoke follows a certain pattern. First comes the command itself. This is the word that tells the system which command you are actually trying to run. After that, often, you will provide a series of options that will modify the behavior of the command (for example, as we have seen, `-l` is an option to the `ls` command). Finally, you might then provide some arguments to the command. These are typically paths to files or directories that you would like the command to operate on. So, in short, a typical Unix command invocation will look like this:

```
command options arguments
```
Of course, there are exceptions. For example, when invoking Java-based programs from your shell, arguments might be supplied in ways that make them look like options, etc. But, for the most part, the above is a useful way of thinking about Unix commands.
Sometimes, especially when using `samtools` or `bcftools`, the command part of the command line might include a command and a subcommand, like `samtools view` or `bcftools query`. This means that the operating system is calling the program `samtools` (for example), and then `samtools` interprets the next token (`view`) to know that it needs to run the `view` routine, and interpret all following options in that context.

We will now break down each element in `command options arguments`.
4.3.1 The command
When you type a command at the Unix prompt, whether it is a command like `ls` or one like `samtools` (Section 19.4), the Unix system has to search around the filesystem for a file that matches the command name and which provides the actual instructions (the computer code, if you will) for what the command will actually do. It cannot be stressed enough how important it is to understand where and how the bash shell searches for these command files. Understanding this well, and knowing how to add directories to the set of places the shell searches for executable commands, will alleviate a lot of frustration that often arises with Unix.
In brief, all Unix shells (and the bash shell specifically) maintain an environment variable called `PATH` that is a colon-separated list of the pathnames where the shell searches for commands. You can print the `PATH` variable using the `echo` command:
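That is simply:

```sh
echo $PATH
```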
On a freshly installed system without many customizations, the `PATH` might look like:
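Based on the search order described next, such a default would be (a reconstruction; your system may differ):

```
/usr/bin:/bin:/usr/sbin:/sbin
```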
which is telling us that, when bash is searching for a command, it searches for a file of the same name as the command first in the directory `/usr/bin`. If it finds it there, then it uses the contents of that file to invoke the command. If it doesn’t find it there, then it next searches for the file in the directory `/bin`. If it’s not there, it searches in `/usr/sbin`, and finally in `/sbin`. If it does not find the command in any of those directories, then it returns the error `command not found`.
When you install programs on your own computer system, quite often the installer will modify a system file that specifies the `PATH` variable upon startup. Thus, after installing some programs that use the command line on a Mac system, the “default” `PATH` might look like:
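A hypothetical example (these directory names are illustrative, not from any particular machine; note the extra entries prepended before the system defaults):

```
/opt/homebrew/bin:/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin
```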
4.3.2 The options
Sometimes these are called flags, and they provide a convenient way of telling a Unix command how to operate. We have already seen a few of them, like the `-a`, `-l`, or `-d` options to `ls`.
Most, but not all, Unix tools follow the convention that options specified by a single letter follow a single dash, while those specified by multiple letters follow two dashes. Thus, the `tar` command takes the single-character options `-x`, `-v`, and `-f`, but also takes an option named like `--check-links`. Some utilities also have two different names—a single-letter name and a long name—for many options.
For example, the `bcftools view` program uses either `-a` or `--trim-alt-alleles` to invoke the option that trims alternate alleles not seen in a given subset of individuals. Other tools, like BEAGLE, are perfectly content to have options that are named with multiple letters following just a single dash.
Sometimes options take parameter values, like `bcftools view -g het`. In that case, `het` is a parameter value. Sometimes the parameter values are joined to the option with an equals sign. With some Unix utilities, single-letter options can be bunged together following a single dash, as in `tar -xvf` being synonymous with `tar -x -v -f`. This is not universal, and it is not recommended to expect it.
Holy Cow! This is not terribly standardized, and probably won’t make sense until you really get in there and start playing around in Unix…
4.3.3 Arguments
These are often file names, or other things that are not preceded by an option flag. For example, in the `ls` command:
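The command being dissected here (with `dir3` a hypothetical directory):

```sh
ls -lrt dir3
```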
`-lrt` is giving `ls` the options `-l`, `-r`, and `-t`, and `dir3` is the argument—the name of the directory whose contents you should list.
4.3.4 Getting information about Unix commands
Zheesh! The above looks like a horrible mish-mash. How do we find out how to use/invoke different commands and programs in Unix? Well, most programs are documented, and you have to learn how to read the documentation.
If a Unix utility is properly installed, you should be able to find its manual page with the `man` command. For example, `man ls` or `man tar`. These “man-pages,” as the results are called, have a fairly uniform format. They start with a summary of what the utility does, then show how it is invoked and what the possible options are by displaying a skeleton in the form “`command options arguments`”, in which square brackets are usually put around things that are not required. This format can get quite ugly and hard to parse for an old human brain, like mine, but stick with it.
If you don’t have a man-page for a program, you might try invoking the program with the `--help` option, or maybe with no option at all. Sometimes, that returns a textual explanation of how the program should be invoked, and the options that are available.
4.4 Handling, Manipulating, and Viewing files and streams
In Unix, there are two main types of files: regular files, which are things like text files, figures, etc.—anything that holds data of some sort; and “special” files, which include directories, which you’ve already seen, and symbolic links, which we will talk about later.
4.4.1 Creating new directories
You can make a new directory with:
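In skeleton form (`path` is a placeholder, as described next):

```sh
mkdir path
```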
where `path` is a path specification (either absolute or relative). Note that if you want to make a directory within a subdirectory that does not currently exist, for example:
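such as this (directory names are hypothetical; the command fails because the parent does not exist yet):

```sh
mkdir new-dir/sub-dir
```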
when `new-dir` does not already exist, then you have to either create `new-dir` first, like:
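like this (same hypothetical names):

```sh
mkdir new-dir
mkdir new-dir/sub-dir
```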
or you have to use the `-p` option of `mkdir`, which creates all necessary parent directories as well, like:
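like this (one command creates both levels):

```sh
mkdir -p new-dir/sub-dir
```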
If there is already a file (regular or directory) with the same path specification as a directory you are trying to create, you will get an error from `mkdir` (unless you are using the `-p` option, in which case `mkdir` doesn’t do anything, but neither does it complain about the fact that a directory already exists where you wanted to make one).
4.4.2 Fundamental file-handling commands
For the day-to-day business of moving, copying, or removing files in the file system, the three main Unix commands are:

- `mv` for moving files and directories
- `cp` for copying files and directories
- `rm` for removing files and directories
These obviously do different things, but their syntax is somewhat similar.
4.4.2.1 mv
`mv` can be invoked with just two arguments, like:

```
mv this there
```

which moves the file (or directory) from the path `this` to the path `there`.
- If `this` is a regular file (i.e., not a directory), and:
  - `there` is a directory, `this` gets moved inside of `there`.
  - `there` is a regular file that exists, then `there` will get overwritten, becoming a regular file that holds the contents of `this`.
  - `there` does not exist, it will be created as a regular file whose contents are identical to those of `this`.
- If `this` is a directory, and:
  - `there` does not exist in the filesystem, the directory `there` will be made, and its contents will be the (former) contents of `this`.
  - `there` already exists, and is a directory, then the directory `this` will be moved inside of the directory `there` (i.e., it will become `there/this`).
  - `there` already exists, but is not a directory, then nothing will change in the filesystem, and an error will be reported.

In all cases, whatever used to exist at path `this` will no longer be found there.
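The regular-file cases above can be sketched in a scratch directory (all file names hypothetical):

```sh
cd "$(mktemp -d)"       # work somewhere disposable
echo A > this
mkdir there_dir
mv this there_dir       # 'there' is a directory: the file lands inside it
echo B > this
echo C > there_file
mv this there_file      # 'there' is an existing file: it gets overwritten
cat there_file          # prints: B
echo D > this
mv this brand_new       # 'there' does not exist: created with contents of 'this'
```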
And `mv` can be invoked with multiple arguments, in which case the last one must be a directory that already exists, which receives all the earlier arguments inside it. So, if you already have a directory named `dest_dir`, then you can move a lot of things into it like:
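like this (with `file1`, `file2`, and `dir1` as hypothetical existing files and directories):

```sh
mv file1 file2 dir1 dest_dir
```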
You can also write that as:
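with a trailing slash on the destination (same hypothetical names):

```sh
mv file1 file2 dir1 dest_dir/
```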
which makes its meaning a little more clear, but there is no requirement that the final argument have a trailing `/`.
Note: if any files in `dest_dir` have the same name as the files you are moving into `dest_dir`, they will get overwritten. So, you must be careful not to overwrite files that you don’t want to overwrite. Using `mv` can be dangerous in that way.
4.4.2.2 cp
This works much the same way as `mv`, with two different flavors:
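the two-argument form (names hypothetical):

```sh
cp this there
```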
and
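the many-into-a-directory form (again hypothetical names; note that copying directories additionally requires the `-R` option):

```sh
cp file1 file2 dest_dir
```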
The result is very much like that of `mv`, but instead of moving the file from one place to another (an operation that can actually be done without moving the data within the file to a different place on the hard drive), the `cp` command actually makes a full copy of the files. Note that, if the files are large, this can take a long time.
4.4.2.3 rm
Finally we get to the very spooky `rm` command, which is short for “remove.” If you say `rm myfile.txt`, the OS will remove that file from your hard drive’s directory. The data that were in the file might live on for some time on your hard drive—in other words, by default, `rm` does not wipe the file off your hard drive, but simply “forgets” where to look for that file. And the space that file took up on your hard drive is no longer reserved, and could easily be overwritten the next time you write something to disk. (Nonetheless, if you do `rm` a file, you should never expect to be able to get it back.) So, be very careful about using `rm`. It takes an `-r` option for recursively removing directories and all of their contents.
When used in conjunction with globbing, `rm` can be very useful. For example, if you wanted to remove all the files in a directory with a `.jpg` extension, you would do `rm *.jpg` from within that directory. However, it’s a disaster to accidentally remove a number of files you might not have wanted to. So, especially as you are getting familiar with Unix, it is worth it to experiment with your globbing using `ls` first, to see what the results are, and only when you are convinced that you won’t remove any files that you do not want to trash should you then use `rm` to remove those files.
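The check-first workflow looks like this (run in a scratch directory with hypothetical files):

```sh
cd "$(mktemp -d)"
touch a.jpg b.jpg notes.txt
ls *.jpg    # check what the glob matches first: a.jpg b.jpg
rm *.jpg    # only then remove
ls          # notes.txt survives
```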
4.4.3 “Viewing” Files
When using a Graphical User Interface, or GUI, to interact with files on your computer, you typically open the files with some application. For example, you open Word files with Microsoft Word. When working in the Unix shell, that same paradigm does not really exist. Rather (apart from a few cases, like the text editors `nano`, `vim`, and `emacs`), instead of opening a file and letting the user interact with it, the shell is much happier just streaming the contents of the file to the terminal.
The most basic of such commands is the `cat` command, which catenates the contents of a file into a very special data stream called stdout, which is short for “standard output.” If you don’t provide any other instruction, data that gets streamed to stdout just shoots by on your terminal screen. If the file is very large, it might do this for a long time. If the file is a text file, then the data in it can be written out in letters that are recognizable. If it is a binary file, then there is no good way to represent the contents as text letters, and your screen will be filled with all sorts of crazy-looking characters.
It is generally best not to `cat` very large files, especially binary ones. If you do, and you need to stop the command from continuing to spew stuff across your screen, you can type `cntrl-c`, which is the universal Unix command for “kill the current process happening on the shell.” Usually that will stop it.
Sometimes you want to just look at the top of a file. The `head` command shows you the first 10 lines of a file. That is valuable. The `less` command shows a file one screenful at a time. You can hit the space bar to see the next screenful, and you can hit `q` to quit viewing the file. If the file has very long lines (as might be the case with a VCF file), then you can give `less` the `-S` option to not wrap lines. In that case, the left and right arrow keys can be used to scroll through the long lines.
Try navigating to a file and using `cat`, `head`, and `less` on it.
One particularly cool thing about `cat` is that if you say:
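with two (hypothetical) file names:

```sh
cat file1 file2
```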
it will catenate the contents of both files, in the order they are listed on the command line, to stdout.
Now, one Big Important Unix Fact is that many programs written to run in the Unix shell behave in the same way regarding their output: they write their output to stdout. We have already seen this with `ls`: its output just gets written to the screen, which is where stdout goes by default.
4.4.4 Redirecting standard output: `>` and `>>`
Unix starts to get really fun when you realize that you can “redirect” the contents of stdout from any command (or group of commands…see the next chapter!) to a file. To do that, you merely follow the command (and all its options and arguments) with `> path`, where `path` is the path specifying the file into which you wish to redirect stdout.
Witness, try this:

```
# echo three lines of text to a file in the /tmp directory
echo "bing
bong
boing" > /tmp/file1

# echo three more lines of text to another file
echo "foo
bar
baz" > /tmp/file2

# now view the contents of the first file
cat /tmp/file1

# and the second file:
cat /tmp/file2
```
It is important to realize that when you redirect output into a file with `>`, any contents that previously existed in that file will be deleted (wiped out!). So be careful about redirecting. Don’t accidentally redirect output into a file that has valuable data in it.
The `>>` redirection operator does not delete the destination file before it redirects output into it. Rather, `>> file` means “append stdout to the contents that already exist in `file`.” This can be very useful sometimes.
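A small sketch of the difference (using a hypothetical file in `/tmp`):

```sh
echo "first"  >  /tmp/append-demo   # > truncates the file, then writes
echo "second" >> /tmp/append-demo   # >> appends to what is already there
cat /tmp/append-demo                # prints "first" then "second"
```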
4.4.5 stdin, `<`, and `|`
Not only do most Unix-based programs deliver output to standard output, but most utilities can also receive input from a file stream called stdin which is short for “standard input.”
If you have data in a file that you want to send into standard input for a utility, you can use the `<` operator, like this:
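for example, sending the `/tmp/file1` we made above into `wc`, which counts lines, words, and characters on its standard input (a sketch; any utility that reads stdin works the same way):

```sh
wc < /tmp/file1
```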
But, since most Unix utilities also let you specify the file as an argument, this is not used very much.
However, what is used all the time in Unix, and is one of the things that makes it super fun, is the pipe, `|`, which says, “take stdout coming out of the command on the left and redirect it into stdin going into the command on the right of the pipe.”
For example, if I wanted to count the number of files and directories stored in my `git-repos` directory, I could do:
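something like this (the path is from the author’s machine; adjust it for your own):

```sh
ls -dl Documents/git-repos/* | wc
```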
which pipes the output of `ls -dl` (one line per file) into the stdin of the `wc` command, which counts the number of lines, words, and characters sent to its standard input. So, the output tells me that there are 174 files and directories in my directory `Documents/git-repos`.
Note that pipes and redirects can be combined in sequence over multiple operations or commands. This is what gives rise to the terminology of making “Unix pipelines:” the data are like streams of water coming into or out of different commands, and the pipes hook up all those streams into a pipeline.
4.4.6 stderr
While output from Unix commands is often written to stdout, if anything goes wrong with a program, then messages about that get written to a different stream called stderr, which, you guessed it, is short for “standard error.” By default, both stdout and stderr get written to the terminal, which is why it can be hard for beginners to think of them as separate streams. But, indeed, they are. Redirecting stdout with `>` does not redirect stderr.
For example, see what happens when we ask `ls` to list a file that does not exist:
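a sketch (the file name is hypothetical, and the exact wording of the error varies between systems):

```sh
ls file-that-does-not-exist              # error message appears on the screen
ls file-that-does-not-exist > /tmp/out   # stdout redirected: error STILL on the screen
cat /tmp/out                             # the redirected file is empty
```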
The error message comes back to the screen. If you redirect the output it still comes back to the screen!
If you want to redirect stderr, then you need to specify which stream it is. On all Unix systems, stderr is stream #2, so the `2>` syntax can be used:
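for example (continuing with the hypothetical missing file):

```sh
ls file-that-does-not-exist 2> /tmp/err.txt   # nothing appears on the screen now
cat /tmp/err.txt                              # the error message went to the file
```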
Then there is no output of stderr to the terminal, and when you `cat` the output file, you see that it went there!
Doing bioinformatics, you will find that there will be failures of various programs. It is essential when you write bioinformatic pipelines to redirect stderr to a file so that you can go back, after the fact, to sleuth out why the failure occurred. Additionally, some bioinformatic programs write things like progress messages to stderr so it is important to know how to redirect those as well.
4.4.7 Symbolic links
Besides regular files and directories, a third type of file in Unix is called a symbolic link. It is a special type of file whose contents are just an absolute or a relative path to another file. You can think of symbolic links as “shortcuts” to different locations in your file system. There are many useful applications of symbolic links.
Symbolic links are made using the `ln` command with the `-s` option. For example, if I did this in my home directory:
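for example (the target here is a hypothetical stand-in for some long, frequently visited path):

```sh
ln -s Documents/git-repos/a-deeply/nested/dir srs
```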
then `srs` becomes a file whose full listing (from `ls -l srs`) looks like:
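something like this (a hypothetical listing; note the `l` in the first column and the arrow showing the link target):

```
lrwxr-xr-x  1 user  staff  40 Sep  1 12:00 srs -> Documents/git-repos/a-deeply/nested/dir
```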
4.4.8 File Permissions
Unix systems often host many different users. Some users might belong to the same research group, and might like to be able to read the files (and/or use the programs) that their colleagues have in their accounts.
The Unix file system uses a system of permissions that gives rights to various classes of users to read, write, or execute files. The permissions associated with a file can be viewed using `ls -l`. They are captured in the first column, which might look something like `-rwxr-xr-x`. When you first start looking at these, they can be distressingly difficult to visually parse. But you will get better at it! Let’s start breaking it down now.
The file description string, in a standard Unix setting, consists of 10 characters.

- The first tells what kind of file it is: `-` = regular file, `d` = directory, `l` = symbolic link.
- The next group of three characters denotes whether the owner/user of the file has permission to read, write, or execute the file.
- The following two groups of three characters are the same thing for users within the file’s group, and for all other users, respectively.
Here is a figure from the web that we can talk about:
Permissions can be changed with the `chmod` command. We will talk in class about how to use it with the octal representation of permissions.
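As a brief taste of both the octal and the symbolic forms (file name hypothetical):

```sh
cd "$(mktemp -d)"
touch script.sh
chmod 755 script.sh    # octal: rwxr-xr-x (owner all; group/others read+execute)
ls -l script.sh
chmod go-x script.sh   # symbolic: remove execute for group and others
```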
4.5 Customizing your Environment
Previously we saw how to modify your command prompt to tell you what the current working directory is (remember `PS1='[\W]--% '`). The limitation of giving that command on the command line is that if you log out and then log back in again, or open a new Terminal window, you will have to reissue that command in order to achieve the desired look of your command prompt. Quite often a Unix user would like to make a number of customizations to the look, feel, and behavior of their Unix shell.
The bash shell allows these customizations to be specified in two different files that are read by the system so as to invoke the customizations. The two files are hidden files in the home directory: `~/.bashrc` and `~/.bash_profile`. They are used by the Unix system in two slightly different contexts, but for most purposes, you, the user, will not need or even want to distinguish between the different contexts. Managing two separate files of customizations is unnecessary, requires duplication of your efforts, and can lead to inconsistent and confusing results, so here is what we will do:
1. Keep all of our customizations in `~/.bashrc`.
2. Insert commands in `~/.bash_profile` that say, “Hey computer! If you are looking for customizations in here, don’t bother; just get them straight out of `~/.bashrc`.”
We take care of #2 by creating the file `~/.bash_profile` to have the following lines in it:
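The standard idiom for this (source `~/.bashrc` if it exists):

```sh
if [ -f ~/.bashrc ]; then
    source ~/.bashrc
fi
```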
Taking care of #1 is now just a matter of writing commands into `~/.bashrc`. In the following are some recommended customizations.
4.5.1 Appearances matter
Some customizations just change the way your shell looks or what type of output is given from different commands. Here are some lines to add to your `~/.bashrc`, along with some discussion of each.
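The first line, reconstructing the prompt string shown earlier in the chapter:

```sh
export PS1='[\W]--% '
```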
This gives a tidier and more informative command prompt. The `export` command before it tells the system to pass the value of this environment variable, `PS1`, along to any other shells that get spawned by the current one.
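The next line is an alias (a hedged reconstruction, assembled from the options discussed just below):

```sh
alias ls='ls -GFh'
```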
This makes it so that each time you invoke the `ls` command, you do so with the options `-G`, `-F`, and `-h`. To find out on your own what those options do, you can type `man ls` at the command line and read the output, but briefly: `-G` causes directories and different file types to be printed in different colors, `-F` causes a `/` to be printed after directory names (and other characters to be printed at the end of the names of different file types), and `-h` causes file sizes to be printed in an easily human-readable form when using the `-l` option.
4.5.2 Where are my programs/commands at?!
We saw in Section 4.3.1 that bash searches the directories listed in the `PATH` variable to find commands and executables. You can modify the `PATH` variable to include directories where you have installed different programs. In doing so, you want to make sure that you don’t lose any of the other directories in `PATH`, so there is a certain way to go about redefining `PATH`. If you want to add the path `/a-new/program/directory` to your `PATH` variable, you do it like this:
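like this (appending the new directory while keeping everything already in `PATH`):

```sh
export PATH=$PATH:/a-new/program/directory
```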
4.6 A Few More Important Keystrokes
If a command “gets stuck” or is running longer than it should be, you can usually kill/quit it by typing `cntrl-c`.
Once you have given a command, it gets stored in your bash history. You can use the up-arrow key to cycle backward through different commands in your history. This is particularly useful if you are building up complex pipelines on the command line piece by piece, looking at the output of each to make sure it is correct. Rather than re-typing what you did for the last command line, you just up-arrow it.
Once you have done an up-arrow or two, you can cycle back down through your history with a down-arrow.
Finally, you can search through your bash history by typing `cntrl-r` and then typing the word/command you are looking for. For example, if, 100 command lines back, you used a command that involved the program `awk`, you can search for that by typing `cntrl-r` and then typing `awk`.
One last big thing to note: the `#` is considered a comment character in bash. This means that any text following a `#` (unless it is backslash-escaped or inside quotation marks), until the next line ending, will be ignored by the shell.
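For instance:

```sh
echo "hello"   # everything from the # onward is ignored
echo "a # inside quotes is not a comment"
```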
4.7 A short list of additional useful commands.
Everyone should be familiar with the following commands, and the options that follow them on each line below. One might even think of scanning the manual page for each of these:
- `echo`
- `cat`
- `head`, `-n`, `-c`
- `tail`, `-n`
- `less`, `-S`
- `sort`, `-n -b -k`
- `paste`
- `cut`, `-d`
- `tar`, `-cvf`, `-xvf`
- `gzip`, `-c`
- `du`, `-h -C`
- `wc`
- `date`
- `uniq`
- `chmod`, `u+x`, `ug+x`
- `grep`
4.8 Two important computing concepts
4.8.1 Compression
Most file storage types (like text files) are a bit wasteful in terms of file space: every character in a text file takes the same number of bytes to store, whether it is a character that is used a lot, like `s` or `e`, or whether it is a character that is seldom seen in many text files, like `^`. Compression is the art of creating a code for different types of data that uses fewer bits to encode “letters” (or “chunks” of data) that occur frequently, and reserves codewords of more bits to encode less frequently occurring chunks in the data. The result is that the total file size is smaller than the uncompressed version. However, in order to read it, the file must be decompressed.
In bioinformatics, many of the files you deal with will be compressed, because that can save many terabytes of disk space. Most often, files will be compressed using the `gzip` utility, and they can be uncompressed with the `gunzip` command. Sometimes you might want to just look at the first part of a compressed file. If the file is compressed with `gzip`, you can decompress to stdout by using `gzcat` (spelled `zcat` on many Linux systems) and then pipe it to `head`, for example.
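A quick sketch of the round trip (file names hypothetical; highly repetitive text compresses very well):

```sh
cd "$(mktemp -d)"
for i in $(seq 1 1000); do echo "some repetitive text"; done > big.txt
gzip big.txt                       # replaces big.txt with big.txt.gz
ls -l big.txt.gz                   # far smaller than the original
gunzip -c big.txt.gz | head -n 3   # peek at the top without decompressing on disk
```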
A central form of compression in bioinformatics is called `bgzip` compression, which compresses files into a series of blocks of the same size, situated in such a way that it is possible to index the contents of the file so that certain parts of the file can be accessed without decompressing the whole thing. We will encounter indexed compressed files a lot when we start dealing with BAM and vcf.gz files.
4.8.2 Hashing
The final topic we will cover here is the topic of hashing, and in particular the idea of “fingerprinting” files on one’s computer. This process is central to how the git version control system works, and it is well worth knowing about.
Any file on your computer can be thought of as a series of bits, 0’s and 1’s, as, fundamentally, that is what the file is. A hashing algorithm is an algorithm that maps a series of bits (of arbitrary length) to a short sequence of bits. The SHA1 hashing algorithm maps arbitrary sequences of bits to a sequence of 160 bits.
There are \(2^{160} \approx 1.46 \times 10^{48}\) possible bit sequences of length 160. That is a vast number. If your hashing algorithm is well randomized, so that bit sequences are hashed into 160 bits in a roughly uniform distribution, then it is exceedingly unlikely that any two bit sequences (i.e. files on your filesystem) will have the same hash (“fingerprint”) unless they are perfectly identical. As hashing algorithms are often quite fast to compute, this provides an exceptionally good way to verify that two files are identical.
The SHA1 algorithm is implemented with the `shasum` command. In the following, as a demonstration, I store the recursive listing of my `git-repos` directory into a file and I hash it. Then I add just a single line ending to the end of the file, and hash that, to note that the two hashes are not at all similar even though the two files differ by only one character:
```
[~]--% ls -R Documents/git-repos/* > /tmp/gr-list.txt
[~]--% # how many lines is that?
[~]--% wc /tmp/gr-list.txt
   93096   88177 2310967 /tmp/gr-list.txt
[~]--% shasum /tmp/gr-list.txt
1396f2fec4eebdee079830e1eff9e3a64ba5588c  /tmp/gr-list.txt
[~]--% # now add a line ending to the end
[~]--% (cat /tmp/gr-list.txt; echo) > /tmp/gr-list2.txt
[~]--% # hash both and compare
[~]--% shasum /tmp/gr-list.txt /tmp/gr-list2.txt
1396f2fec4eebdee079830e1eff9e3a64ba5588c  /tmp/gr-list.txt
23bff8776ff86e5ebbe39e11cc2f5e31c286ae91  /tmp/gr-list2.txt
[~]--% # whoa! cool.
```
4.9 Unix: Quick Study Guide
This is just a table (Table 4.1) with quick topics/commands/words in it. You should understand each and be able to tell a friend a lot about each one.
|  |  |  |
|---|---|---|
| `bash` | absolute path | relative path |
| `/` at beginning of path | `/` between directories | home directory |
| `~` | current working directory | `pwd` |
| `cd` | `.` | `..` |
| `cd -` | basename | `PS1` |
| TAB-completion | `ls` (`-a`, `-d`, `-R`) | globbing |
| `*` | `?` | `[0-9]` |
| `[a-z]` | `[^CDcd]` | `{png,jpg,pdf}` |
| `echo` | `man command` | `mkdir` |
| `mv` | `cp` | `rm` |
| `cat` | `head` | `less` (`-S`) |
| stdout | stdin | stderr |
| `>` | `<` | `\|` |
| `ln -s` | symbolic link | `PATH` |
| `-rw-r--r--` | `.bashrc` | `.bash_profile` |
| `sort` (`-n -b -k`) | `paste` | `cut` (`-d`) |
| `tar` (`-cvf`, `-xvf`) | `gzip` | `du` (`-h -C`) |
| `wc` | `date` | `uniq` |
| `cntrl-c` | `cntrl-r` | `#` |
| up-arrow/down-arrow | `chmod` (`ug+x`, `664`) | `grep` |