Probability and Random Processes for Electrical Engineering presents a carefully motivated, accessible, and interesting introduction to probability and random processes. It is designed to allow the instructor maximum flexibility in the selection of topics. In addition to the standard topics taught in introductory courses on probability, random variables, and random processes, the book includes sections on modeling, basic statistical techniques, computer simulation, reliability, and entropy, as well as concise but relatively complete introductions to Markov chains and queueing theory.
The complexity of the systems encountered in electrical and computer engineering calls for an understanding of probability concepts and a facility with probability tools on the part of a growing number of B.S. graduates. The introductory course should therefore teach students not only the basic theoretical concepts but also how to solve problems that arise in engineering practice. Such a course requires that students develop problem-solving skills and understand how to make the transition from a real problem to a probability model for that problem.

Relevance to Engineering Practice
Motivating students is a major challenge in introductory probability courses. Instructors need to respond by showing students the relevance of probability theory to engineering practice. Chapter 1 addresses this challenge by discussing the role of probability models in engineering design. Practical applications from various areas of electrical and computer engineering are used to show how averages and relative frequencies provide the proper tools for handling the design of systems that involve randomness. These application areas are used in examples and problems throughout the text.

From Problems to Probability Models
The transition from real problems to probability models is shown in several ways. First, important concepts are usually developed by presenting real data or computer-simulated data. Second, sections on basic statistical techniques are integrated throughout the text. These sections demonstrate how statistical methods provide the link between theory and the real world. Finally, the significant random variables and random processes are developed using model-building arguments that range from simple to complex. For example, in Chapters 2 and 3, the discussion proceeds from coin tossing to Bernoulli trials, continues to the binomial and geometric distributions, and finally proceeds via limiting arguments to the Poisson, exponential, and Gaussian distributions.

Examples and Problems
Numerous examples in every section are used to demonstrate analytical and problem-solving techniques, develop concepts using simplified cases, and illustrate applications. The text includes over 700 problems, identified by section to help the instructor select homework problems. Additional sets of problems requiring cumulative knowledge are provided at the end of each chapter. Answers to selected problems are included at the end of the text. A Student Solutions Manual accompanies this text to help develop problem-solving skills; it presents carefully worked-out solutions to a sampling of 25% of the problems, selected to help students understand the concepts presented in the text. An Instructor's Solutions Manual with complete solutions is also available.

Computer Methods
The development of an intuition for randomness can be aided by the use of computer exercises. Appendix C contains computer programs for generating several well-known random variables. The resulting data from computer-generated random numbers and variables can be analyzed using the statistical methods introduced in the text.
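A standard technique behind such generators is the inverse-transform method: if U is uniform on (0, 1), then X = F⁻¹(U) has distribution F. The sketch below applies it to the exponential distribution; it is an illustrative example in Python, not the book's actual Appendix C code.

```python
import random
from math import log

def exponential_rv(lam, u=None):
    """Generate an Exponential(lam) sample by the inverse-transform method.

    For F(x) = 1 - exp(-lam*x), the inverse CDF is F^{-1}(u) = -ln(1 - u)/lam,
    so plugging in a Uniform(0,1) sample u yields an exponential sample.
    """
    if u is None:
        u = random.random()
    return -log(1.0 - u) / lam
```

The resulting samples can then be analyzed with the statistical methods introduced in the text, for example by comparing the sample mean against the theoretical mean 1/lam.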
Sections on computer methods have been integrated into the text rather than isolated in a separate chapter because performing the computer exercises during lessons helps students to learn basic probability concepts. It should be noted that the computer methods introduced in Sections 2.7, 3.11, and 4.10 do not necessarily require entirely new lectures. The transformation method in Section 3.11 can be incorporated into the discussion on functions of a random variable. Similarly, the material in Section 4.10 can be incorporated into the discussion on transformations of random vectors.

Random Variables and Continuous-Time Random Processes
Discrete-time random processes provide a crucial "bridge" in going from random variables to continuous-time random processes. Care is taken in the first five chapters to lay the proper groundwork for this transition. Thus sequences of dependent experiments are discussed in Chapter 2 as a preview of Markov chains. In Chapter 4, emphasis is placed on how a joint distribution generates a consistent family of marginal distributions. Chapter 5 introduces sequences of independent identically distributed (iid) random variables. Chapter 6 considers the sum of an iid sequence to produce important examples of random processes. Throughout Chapters 6 and 7, a concise development of the concepts is achieved by developing discrete-time and continuous-time results in parallel.

Markov Chains and Queueing Theory
Markov chains and queueing theory have become essential tools in communication network and computer system modeling. In the introductory course on probability only a few changes need to be made to accommodate these new requirements. The treatment of conditional probability and conditional expectation needs to be modified, and the Poisson and gamma random variables need to be given greater prominence. In an introductory course on random processes a new balance needs to be struck between the traditional discussion of wide-sense stationary processes and linear systems and the discussion of Markov chains and queueing theory. The "optimum" balance between these two needs will surely vary from instructor to instructor, so the text includes more material than can be covered in one semester in order to give the instructor leeway to strike a balance.

Suggested Syllabi
The first five chapters form the basis of a one-semester introduction to probability. In addition to the optional sections on computer methods, these chapters also include optional sections on combinatorics, reliability, confidence intervals, and basic results from renewal theory. In a one-semester course, it is possible to provide an introduction to random processes by omitting all the starred sections in the first five chapters and covering instead the first part of Chapter 6. The material in the first five chapters has been used at the University of Toronto in an introductory junior-level required course for electrical engineers.
A one-semester course on random processes with Markov chains can be taught using Chapters 6 through 8. A quick introduction to Markov chains and queueing theory is possible by covering only the first three sections of Chapter 8 and then proceeding to the first few sections in Chapter 9. A one-semester introduction to queueing theory can be taught from Chapters 6, 8, and 9.

Changes in the Second Edition
The only changes in the second edition that affect the first half of the book, and hence introductory courses on probability, involve the addition of more examples and problems. In keeping with our goal of giving the instructor flexibility in the selection of topics, we have expanded the optional section on reliability (Section 3.10) and introduced a new optional section on entropy (Section 3.12). Care has been taken not just to define the various quantities associated with entropy but also to develop an understanding of the interpretation of entropy as a measure of uncertainty and information.
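As a small illustration of entropy as a measure of uncertainty, in the spirit of Section 3.12 (this sketch is ours, not material from the text): for a discrete distribution, H = -Σ pᵢ log₂ pᵢ bits, which vanishes for a certain outcome and is maximized by the uniform distribution.

```python
from math import log2

def entropy(p):
    """Entropy H = -sum(p_i * log2(p_i)) in bits, with 0*log(0) taken as 0."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

# A fair coin carries 1 bit of uncertainty, a certain outcome carries none,
# and a biased coin falls strictly in between.
```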
The most significant change to the second edition is the addition of material to make the text more suitable for a course that provides a more substantial introduction to random processes:
In Chapter 4, a section on the joint characteristic function has been added and the discussion of jointly Gaussian random variables has been expanded.
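A compact way to see how jointly Gaussian random variables arise is the standard linear construction of a correlated pair; the sketch below is illustrative Python of ours, not an excerpt from Chapter 4.

```python
import random
from math import sqrt

def jointly_gaussian_pair(rho, rng=random):
    """Return (X, Y): zero-mean, unit-variance jointly Gaussian with correlation rho.

    Uses the linear construction Y = rho*X + sqrt(1 - rho**2)*Z,
    where X and Z are independent standard Gaussian variables.
    """
    x = rng.gauss(0.0, 1.0)
    z = rng.gauss(0.0, 1.0)
    y = rho * x + sqrt(1.0 - rho * rho) * z
    return x, y
```

Because Y is a linear combination of independent Gaussians, (X, Y) is jointly Gaussian with E[XY] = rho by construction.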
Section 5.5 discusses the various types of convergence of sequences of random variables. A carefully selected set of examples is presented to demonstrate the differences in the various types of convergence.
Section 6.6 uses these results to develop the notions of mean square continuity, derivatives, and integrals of random processes. This section presents the relations between the Wiener process and white Gaussian noise. It also develops the Ornstein-Uhlenbeck process as the transient solution to a first-order linear system driven by noise.
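A first-order linear system driven by noise can be simulated with a simple Euler (Euler-Maruyama) discretization of dX = -alpha*X dt + sigma dW. This is a minimal sketch with illustrative parameter names, not the text's development.

```python
import random
from math import sqrt

def ornstein_uhlenbeck(n_steps, alpha, sigma, dt, x0=0.0, seed=None):
    """Simulate dX = -alpha*X dt + sigma dW by the Euler-Maruyama scheme.

    Models the transient response of a first-order linear system driven by
    white Gaussian noise; the stationary variance is sigma**2 / (2*alpha).
    """
    rng = random.Random(seed)
    x, path = x0, [x0]
    for _ in range(n_steps):
        x += -alpha * x * dt + sigma * sqrt(dt) * rng.gauss(0.0, 1.0)
        path.append(x)
    return path
```

With sigma = 0 the path reduces to the deterministic exponential decay of the first-order system; with noise present, the sample variance settles near sigma**2 / (2*alpha).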
Section 6.8 uses Fourier series to introduce the notion of representing a random process by a linear combination of deterministic functions weighted by random variables. It then proceeds to develop the Karhunen-Loeve expansion for vector random variables and then random processes.
Section 7.4 now contains a separate section on prediction and the Levinson algorithm.
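The Levinson (Levinson-Durbin) recursion solves the Yule-Walker equations for the optimal linear-predictor coefficients in O(p²) operations by exploiting the Toeplitz structure of the autocorrelation matrix. The following compact sketch is ours, not the text's listing.

```python
def levinson_durbin(r, order):
    """Levinson-Durbin recursion: from autocorrelations r[0..order], compute
    coefficients a[0..order-1] of the predictor x_hat[n] = sum_j a[j]*x[n-1-j]
    and the final mean-square prediction error."""
    a = []            # predictor coefficients at the current order
    err = r[0]        # prediction-error power
    for m in range(1, order + 1):
        # reflection coefficient for order m
        k = (r[m] - sum(a[j] * r[m - 1 - j] for j in range(len(a)))) / err
        a = [a[j] - k * a[m - 2 - j] for j in range(len(a))] + [k]
        err *= 1.0 - k * k
    return a, err
```

For an AR(1)-type autocorrelation r[k] = ρ^k, the order-2 predictor reduces to a = [ρ, 0]: samples beyond lag 1 carry no additional predictive information.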
Finally, Section 7.5 presents a discussion of the Kalman filter to complement the Wiener filter introduced in Section 7.4.

Acknowledgments