Department of Electrical and Computer Engineering, University of Manitoba, 75 Chancellor's Circle, Winnipeg, MB R3T 5V6, Canada

Abstract

Separate source and channel coding is known to be sub-optimal for communicating correlated sources over a Gaussian multiple access channel (GMAC). This paper presents an approach to designing distributed joint source-channel (DJSC) codes for encoding correlated binary sources over a two-user GMAC, using systematic irregular low-density parity check (LDPC) codes. The degree profile defining the LDPC code is optimized for the joint source probabilities using extrinsic information transfer (EXIT) analysis and linear programming. A key issue addressed is the Gaussian modeling of log-likelihood ratios (LLRs) generated by nodes representing the joint source probabilities in the combined factor graph of the two LDPC codes, referred to as source-channel factor (SCF) nodes. It is shown that the analytical expressions based on additive combining of incoming LLRs, as done in variable nodes and parity check nodes of the graph of a single LDPC code, cannot be used with SCF nodes. To this end, we propose a numerical approach based on Monte-Carlo simulations to fit a Gaussian density to outgoing LLRs from the SCF nodes, which makes the EXIT analysis of the joint decoder tractable. Experimental results are presented which show that LDPC codes designed with the proposed approach outperform previously reported DJSC codes for the GMAC. Furthermore, they demonstrate that when the sources are strongly dependent, the proposed DJSC codes can achieve code rates higher than the theoretical upper-bound for separate source and channel coding.

1 Introduction

Wireless communication of multiple correlated information sources to a common receiver has become an important research problem due to potential applications in emerging information gathering systems such as wireless sensor networks (WSNs).

DJSC coding of correlated sources for a GMAC has been studied sparsely in the literature. While there is no known tractable way to optimize a DJSC code for a given set of correlated sources and a MAC, a sub-optimal but effective and tractable framework is to encode each source using an independent channel code in such a manner that the resulting dependence between the MAC input codewords can be exploited by a joint decoder.

In contrast to previous work, in this paper, we present a DJSC code design approach for a pair of correlated binary sources, in which the degree profile of a systematic irregular LDPC (SI-LDPC) code is optimized for the joint distribution of the two sources and the signal-to-noise ratio (SNR) of the GMAC. Our motivations for using SI-LDPC codes are the following: (1) systematic codes can be used to exploit inter-source correlation in joint decoding of the two codes, (2) LDPC codes can be optimized by linear programming, in conjunction with the EXIT analysis of the belief propagation (BP)-based joint decoder, and (3) LDPC codes are known to be capacity achieving for the single-user case.

This paper is organized as follows: Section 2 formulates the DJSC code design problem addressed in this paper and the code optimization procedure is presented in Section 3. Section 4 studies the problem of modeling the pdf of outgoing LLRs from SCF nodes and presents a numerical method for computing the mutual information of these messages in the EXIT analysis. Section 5 presents and discusses the simulation results. Conclusions are given in Section 6.

2 Problem setup

A block diagram of the system under consideration is shown in Figure. Let $U_1$ and $U_2$ be two dependent, uniformly distributed binary sources, and let the dependence between them be described by the parameter $p = P(U_1 \neq U_2)$. Each source is encoded independently, and the code bits are mapped to channel inputs $X_k \in \{+1,-1\}$, $k = 1, 2$, in the equivalent base-band representation. The output of the GMAC is

$$Y = X_1 + X_2 + Z, \qquad (1)$$

where $Z$ is zero-mean Gaussian noise with variance $\sigma^2$. In general, the maximum sum rate achievable over a GMAC for two dependent sources is not known. However, when the sources are independent, i.e., $p = 0.5$, the maximum sum rate is given by $I(X_1, X_2; Y)$. In Figure, we plot $I(X_1, X_2; Y)$ as a function of $p$ for different values of $\sigma^2$. Notice that the maximum of $I(X_1, X_2; Y)$ increases as the sources become more dependent, which shows that optimizing the codes for the values of these parameters will result in higher sum rates for dependent sources over the same GMAC, compared to independent sources.
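For illustration, the trend in the figure can be reproduced numerically. The following sketch (function name, integration grid, and parameter values are our choices, not from the paper) estimates $I(X_1,X_2;Y) = h(Y) - h(Z)$ for the model $Y = X_1 + X_2 + Z$ with antipodal inputs and $P(X_1 \neq X_2) = p$:

```python
import numpy as np

def gmac_sum_rate(p, sigma2):
    """Estimate I(X1, X2; Y) in bits for Y = X1 + X2 + Z with antipodal
    inputs, P(X1 != X2) = p, and Z ~ N(0, sigma2)."""
    # X1 + X2 takes value +2 or -2 w.p. (1 - p)/2 each, and 0 w.p. p.
    centers = np.array([2.0, 0.0, -2.0])
    weights = np.array([(1 - p) / 2, p, (1 - p) / 2])
    y = np.linspace(-12, 12, 6001)
    # p(y) is a three-component Gaussian mixture.
    pdf = sum(w * np.exp(-(y - c) ** 2 / (2 * sigma2))
              / np.sqrt(2 * np.pi * sigma2)
              for w, c in zip(weights, centers))
    dy = y[1] - y[0]
    h_y = -np.sum(pdf * np.log2(np.maximum(pdf, 1e-300))) * dy  # h(Y)
    h_z = 0.5 * np.log2(2 * np.pi * np.e * sigma2)              # h(Y|X1,X2) = h(Z)
    return h_y - h_z
```

For a fixed $\sigma^2$, decreasing $p$ (stronger dependence) concentrates $X_1 + X_2$ at $\pm 2$, increasing the output entropy and hence the sum rate, which matches the trend discussed above.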


**Block diagram of the system considered in this paper.**


**$I(X_1, X_2; Y)$ as a function of the inter-source correlation parameter $p$ for different values of the GMAC noise variance $\sigma^2$.**

The optimal DJSC code for the given GMAC must induce a distribution $P(X_1, X_2)$ which maximizes the sum rate of the two channel inputs. In the proposed scheme, systematic channel codes are used, so that $P(X_1, X_2) = P(U_1, U_2)$ for the systematic bits of the channel input codewords (the parity bits of each source are related to the information bits as given by the parity-check equations of the code). The joint distribution $P(U_1, U_2)$ of the information bits of the two channel input codewords provides an additional joint decoding gain, and hence the number of parity bits required for encoding each source is reduced, or equivalently, the achievable sum rate is higher. With practical (finite length) channel codes, this implies that the same decoding error probability can be achieved at a higher sum rate. Note that, by construction, the aforementioned DJSC coding scheme requires that the same code length $n_c$ be used for both sources.

The code design approach presented in this paper is based on systematic irregular LDPC (SI-LDPC) codes. First, consider a single LDPC code whose parity check matrix **H** can be represented by a factor graph with code bit variable (CBV) nodes and parity check factor (PCF) nodes. Let $\lambda_i$ (resp. $\rho_i$) be the fraction of edges connected to CBV (resp. PCF) nodes of degree $i$. The maximum degrees $d_{vmax}$ and $d_{cmax}$ are typically chosen in such a manner that the sparsity of the corresponding factor graph is maintained (i.e., the number of edges in the factor graph grows linearly with the codeword length). As is common, the check degree distribution is chosen to be concentrated, of the form $\rho(x) = \bar{\rho}\,x^{s-2} + (1-\bar{\rho})\,x^{s-1}$ for some integer $s$.
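For reference, the design rate implied by an edge-perspective degree profile follows from the standard relation $R = 1 - (\sum_i \rho_i/i)\,/\,(\sum_i \lambda_i/i)$. A minimal sketch, with our own helper name:

```python
def ldpc_rate(lam, rho):
    """Design rate from edge-perspective degree distributions, given as
    dicts mapping degree -> fraction of edges (lam for CBV nodes,
    rho for PCF nodes)."""
    inv_v = sum(f / d for d, f in lam.items())  # proportional to number of CBV nodes
    inv_c = sum(f / d for d, f in rho.items())  # proportional to number of PCF nodes
    return 1.0 - inv_c / inv_v

# A regular (3,6) profile gives the familiar rate-1/2 code:
print(ldpc_rate({3: 1.0}, {6: 1.0}))  # 0.5
```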

Now consider a two-input GMAC with an SI-LDPC code applied to each input. Since (1) is symmetric with respect to $X_1$ and $X_2$ and the same rate is used for both sources, the same channel code can be used for both sources. The parity check matrix **H** of each SI-LDPC code, whose code bits are $x_k(1),\ldots,x_k(n_c)$, $k = 1, 2$, imposes the parity-check constraints, while the systematic positions of the two codewords carry the correlated source bits $u_1(j)$ and $u_2(j)$.


**Combined factor graph used for joint decoding of the two LDPC codes.** Bit remapping is used to convert the systematic channel code bits $x_k(1),\ldots,x_k(n_c)$ into the ordering required by the combined graph.

Then, it is easy to verify that the MAP decoded value of each source bit is obtained by marginalizing the joint posterior distribution of the two codewords given the GMAC output sequence.

In the factor graph representation of (3), each factor node represents a term in the product. The factor nodes $f_i(\cdot)$ attached to the systematic bit pairs $(u_1(j), u_2(j))$ represent both the GMAC output conditional density and the joint distribution $P(u_1, u_2)$ of the source bits; these are the SCF nodes. For the parity bits of an LDPC code (which has a dense generator matrix), it can be assumed that $x_1(j)$ and $x_2(j)$ are independent, so the corresponding factor nodes represent only the GMAC output conditional density.

Sparse parity check matrices obtained through the EXIT-analysis design procedure do not necessarily correspond to systematic generator matrices. As usual, the codes can be converted to systematic form by Gaussian elimination. However, the resulting codes have dense parity-check matrices, which makes the computational complexity of BP decoding impractically high. To get around this problem, a bit remapping procedure is used so that decoding can still be performed on the sparse factor graph.

3 Code optimization

A well-known simple method for constructing a near-capacity-achieving SI-LDPC code for a single-input AWGN channel with noise variance $\sigma^2$ and some fixed check degree distribution is to find the variable degree distribution $\lambda_i$ which maximizes the rate of the code under BP decoding, subject to a Gaussian approximation (GA) for the messages passed in the decoder.

In the optimization of an SI-LDPC code for two correlated sources to be transmitted over a GMAC, the objective is to determine the degree distribution which maximizes the code rate $R_c$, given the source correlation parameter $p$, the noise variance $\sigma^2$, and some fixed check degree distribution. The convergence condition of BP decoding requires that the mutual information between the code bits and the LLRs passed from the CBV nodes to the PCF nodes increases from one iteration to the next, for every degree $i$ with $\lambda_i > 0$. Since the objective function and the constraints are all linear in the code parameters $\lambda_i$, we can use a linear program to solve the problem. The rest of this section is devoted to the EXIT analysis of BP decoding on the joint factor graph and the iterative computation of the mutual information.

The details of the BP decoding algorithm and the EXIT analysis for single-user LDPC codes can be found in the literature.

In the case of LDPC codes applied to two correlated sources transmitted over a GMAC and decoded using a combined factor graph, the mutual information updates through CBV nodes and PCF nodes can be analytically computed as in the case of single-user LDPC codes. Denote the messages passed between the various nodes in the factor graph as in Figure. The message passed from a degree-$d_v$ CBV node of the code is then the sum of its incoming LLRs,


**Message (LLR) flow through the $j$th SCF node in the combined factor graph.**

and the message passed from a PCF node of degree $d_c$ on its $i$th edge is given by the standard check-node update.

Note, however, that the LLRs computed by an SCF node cannot be formed as the sum of the incoming LLRs; they are instead determined by (4). In the Appendix, an expression for these LLRs is derived for uniformly distributed sources.
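To make the non-additivity concrete, the following sketch computes the outgoing LLR for $u_1$ at an SCF node by explicit marginalization over $u_2$, combining the joint source prior, the GMAC likelihood, and the incoming belief. This is our own simplified single-edge version of the update, with hypothetical names:

```python
import numpy as np

def scf_llr_out(y, L_in, p, sigma2):
    """Outgoing LLR for u1 from an SCF node, given the GMAC output y,
    the incoming LLR L_in about u2, the correlation parameter
    p = P(u1 != u2), and the noise variance sigma2."""
    def lik(u1, u2):
        # GMAC likelihood p(y | x1 + x2) with the mapping x = 1 - 2u.
        x = (1 - 2 * u1) + (1 - 2 * u2)
        return np.exp(-(y - x) ** 2 / (2 * sigma2))
    def prior(u1, u2):
        # Joint source prior for uniform marginals.
        return p / 2 if u1 != u2 else (1 - p) / 2
    def belief(u2):
        # Probability of u2 implied by the incoming LLR.
        return 1 / (1 + np.exp(-L_in)) if u2 == 0 else 1 / (1 + np.exp(L_in))
    num = sum(prior(0, u2) * lik(0, u2) * belief(u2) for u2 in (0, 1))
    den = sum(prior(1, u2) * lik(1, u2) * belief(u2) for u2 in (0, 1))
    return float(np.log(num / den))
```

Unlike a CBV node, the output here is a nonlinear function of `L_in`, which is why no simple closed-form Gaussian model for the outgoing density is available.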

4 EXIT analysis

Let $I(X; Y)$ denote the mutual information between two random variables $X$ and $Y$.

Let $d_{vmax}$ denote the maximum CBV node degree. Given that all incoming messages to a CBV node are independent, the outgoing message is given by the sum of the incoming LLRs.

Under the assumption that the LLRs generated by a CBV node are Gaussian with mean equal to the sum of the means of its incoming messages, and with variance equal to twice the mean (the consistency condition), the mutual information carried by these messages can be computed through the usual $J(\cdot)$ function.

Let the mutual information between the bit associated with a CBV node and the message it receives from a degree-$i$ PCF node be denoted $I_i$.

Furthermore, the average mutual information between CBV nodes and the messages passed to SCF nodes is given by the $\lambda_i$-weighted average of these per-degree quantities.
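The bookkeeping above relies on the standard $J(\cdot)$ function, which maps the standard deviation of a consistent Gaussian LLR to the mutual information it carries. A Monte-Carlo sketch (the sample size and seed are arbitrary choices of ours):

```python
import numpy as np

def J(sigma, n=200_000, seed=0):
    """Mutual information (bits) between a bit and a consistent Gaussian
    LLR L ~ N(sigma^2 / 2, sigma^2), estimated by Monte Carlo."""
    if sigma <= 0:
        return 0.0
    rng = np.random.default_rng(seed)
    L = rng.normal(sigma ** 2 / 2, sigma, size=n)
    # I = 1 - E[log2(1 + exp(-L))] under the all-zero codeword convention.
    return float(1.0 - np.mean(np.log2(1.0 + np.exp(-L))))
```

$J(\cdot)$ is monotone increasing, with $J(0) = 0$ and $J(\sigma) \to 1$ as $\sigma \to \infty$, so it can be inverted numerically (e.g., by bisection) when mapping mutual information back to message means.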

Next, consider the computation of the mutual information between the code bits and the outgoing messages of the SCF nodes, which depend on the joint source distribution and the noise variance $\sigma^2$. As will be demonstrated below, the density of these messages is not well modeled by a consistent Gaussian whose parameters follow from additive LLR combining. On the other hand, if we model the pdf of the outgoing messages by a Gaussian (or a Gaussian mixture) whose parameters are estimated numerically as functions of the incoming message means and $\sigma^2$, then the EXIT analysis remains tractable.

Next, we present an approach to numerically estimate the mean values of the outgoing LLRs from an SCF node, given the means of the incoming LLRs and the noise variance $\sigma^2$.


**Histograms of the outgoing messages from SCF nodes for different values of the incoming message mean ($\sigma^2 = 5$).**

For approximating an arbitrary distribution by a Gaussian, transformation-based methods are widely used. However, in the present setting the fitted density must be expressed as a function of the incoming message means and $\sigma^2$. To this end, we consider the following three approaches.

• Mean-matched Gaussian approximation: the mean of the fitted Gaussian is set equal to the sample mean of the observed outgoing LLRs.

• Mode-matched Gaussian approximation: the mean of the fitted Gaussian is set equal to the mode of the histogram of the observed outgoing LLRs.

• Gaussian mixture approximation: the means $\mu_1, \mu_2$ and the weights $\alpha_1, \alpha_2$ of a two-component mixture are estimated from the observations.
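The first two approaches can be sketched directly. The consistency convention used below (variance set to twice the absolute mean) and the helper names are our assumptions:

```python
import numpy as np

def fit_gaussian_mean(llrs):
    """Mean-matched fit: center the Gaussian on the sample mean."""
    mu = float(np.mean(llrs))
    return mu, 2.0 * abs(mu)        # (mean, variance) under consistency

def fit_gaussian_mode(llrs, bins=101):
    """Mode-matched fit: center the Gaussian on the histogram mode."""
    counts, edges = np.histogram(llrs, bins=bins)
    k = int(np.argmax(counts))
    mode = 0.5 * (edges[k] + edges[k + 1])
    return mode, 2.0 * abs(mode)
```

For a symmetric unimodal sample the two fits agree; they differ precisely when the SCF output density is skewed or multimodal, which is the case illustrated in the histograms above.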

The rationale for using these approximations can be seen from Figure, which shows that the density of the outgoing messages deviates from a consistent Gaussian and varies with both the incoming message mean and $\sigma^2$. Furthermore, as will be shown (see Figure), the choice of approximation has a significant effect on the performance of the resulting codes.


**Mutual information update through SCF nodes.**


**Mutual information update through SCF nodes.**


**Decoding error probability of codes designed with the three approximation methods shown in Figure ($\sigma^2 = 5$, $p = 0.01$).**

With all three approximation methods, the required parameters are estimated by Monte-Carlo simulation: given $P(U_1, U_2)$ and $\sigma^2$, generate a large set of samples of the channel output and the incoming messages, compute the corresponding outgoing LLRs from the SCF nodes, and estimate the parameters of the fitted density from the resulting sample set.

In the case of the Gaussian mixture approximation, the mean values $\mu_1, \mu_2$ and the weights $\alpha_1, \alpha_2$ can be estimated from the sample set of outgoing LLRs.
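The paper does not specify the estimator, so as one concrete possibility, the sketch below fits the two-component mixture with a plain expectation-maximization loop (the quantile initialization and fixed iteration count are our choices):

```python
import numpy as np

def fit_gmm2(x, iters=60):
    """Fit weights, means, and variances of a 2-component Gaussian
    mixture to samples x with a basic EM loop."""
    x = np.asarray(x, dtype=float)
    mu = np.quantile(x, [0.25, 0.75])          # crude initialization
    var = np.full(2, np.var(x))
    w = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibility of each component for each sample.
        dens = np.stack([
            w[k] * np.exp(-(x - mu[k]) ** 2 / (2 * var[k]))
            / np.sqrt(2 * np.pi * var[k])
            for k in range(2)
        ])
        resp = dens / dens.sum(axis=0)
        # M-step: re-estimate weights, means, and variances.
        Nk = resp.sum(axis=1)
        w = Nk / x.size
        mu = (resp * x).sum(axis=1) / Nk
        var = (resp * (x - mu[:, None]) ** 2).sum(axis=1) / Nk
    return w, mu, var
```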

5 Simulation results

In this section, we present simulation results obtained by designing DJSC codes for a pair of uniformly distributed binary sources (whose statistical dependence is given by $p$) transmitted over a GMAC with noise variance $\sigma^2$.

First, we investigate the impact of the three message density approximations considered in Section 4. As evidenced by the figures, the mode-matched approximation yields the best-performing codes; the values of $p$ and $\sigma^2$ are identical to those used in Figure. For a decoding error probability of $10^{-6}$, the codeword length required with the mode-matched approximation is approximately $1.7 \times 10^{4}$ bits, while that with the mean-matched approximation is approximately $3.8 \times 10^{4}$ bits. In obtaining the simulation results in the rest of this section, we have used the mode-matched approximation.

As discussed in Section 2, the capacity of a GMAC can be higher for dependent sources as compared to independent sources. The table below lists the optimized degree distributions and code rates $R_c$ for sources with a given correlation level, together with the JSC rate $R_{JSC}$ achieved by the proposed DJSC codes, for a codeword length of $10^6$ bits and a decoding error probability of $10^{-6}$. The rate lower-bound $R_{ind}$ for independent sources over the same GMAC is also listed.

| **σ** | **0.3** | **0.4** | **0.5** | **0.6** |
|---|---|---|---|---|
| $\rho(x)$ | $x^{9}$ | $x^{8}$ | $x^{7}$ | $x^{6}$ |
| $\lambda(x)$ | $0.284x^{2}+0.0222x^{4}+0.1344x^{7}+0.0977x^{8}+0.1277x^{19}+0.08x^{99}$ | $0.3012x^{2}+0.0982x^{3}+0.1322x^{10}+0.1363x^{99}$ | $0.3111x^{2}+0.1655x^{3}+0.0786x^{11}+0.0321x^{16}+0.0583x^{99}$ | $0.4411x^{2}+0.0233x^{17}+0.0321x^{17}+0.0682x^{99}$ |
| $R_c$ | 0.5614 | 0.5226 | 0.4857 | 0.4528 |
| $R_{ind}$ | 0.5014 | 0.4731 | 0.4393 | 0.4113 |
| $R_{JSC}$ | 0.6328 | 0.6017 | 0.5744 | 0.5509 |

**For binary sources with correlation $p$ transmitted over a GMAC with noise variance $\sigma^2$.** $R_c$ is the code rate.

**JSC rate $R_{JSC}$ (channel uses per source bit) of the proposed DJSC codes.** The codeword length is $10^6$ bits. The theoretical lower-bound $R_{ind}$ for independent sources over the same GMAC is also shown.

In order to further demonstrate the advantage of the proposed DJSC codes compared to separate source and channel coding, we next compare three different system designs which differ in terms of the use of prior information about the inter-source correlation parameter $p$:

• Scheme 1: both code design and decoding assume independent sources ($p_{design} = 0.5$, $p_{decode} = 0.5$).

• Scheme 2: the code is designed assuming independent sources ($p_{design} = 0.5$), but the actual value of $p$ is used in decoding ($p_{design} = 0.5$, $p_{decode} = p_{actual}$).

• Scheme 3: the actual value of $p$ is used in both code design and decoding ($p_{design} = p_{decode} = p_{actual}$).

Figure compares the code rates achieved by these schemes for $\sigma^2 = 1$ and a decoding error probability of $10^{-6}$. While the rate achieved by the schemes increases as the inter-source correlation increases, note that for ($p_{design} = 0.5$, $p_{decode} = 0.5$) and ($p_{design} = 0.5$, $p_{decode} = p_{actual}$), the same pair of codes has been used for all values of $p$.


**Code rates (in bits per channel use) achieved by Scheme 2 and Scheme 3.** The points correspond to a decoding error probability of $10^{-6}$, $\sigma^2 = 1$, and a codeword length of $10^6$ bits.


**Comparison of the three code-design/decoding schemes at different values of inter-source correlation (codeword length is $10^{6}$ bits).**


**The performance of the three code-design/decoding schemes as a function of codeword length, at different channel SNR values ($p = 0.2$).**

It is of interest to compare the performance of the proposed LDPC code constructions with previously reported concatenated LDGM codes.


**Comparison of the proposed LDPC codes with previously reported LDGM codes (Figure eight of the cited work; $p = 0.1$).** The SNR gap refers to the difference between the SNR of the actual channel for which the code is designed and the SNR corresponding to the theoretical limit for independent sources.

While the proposed LDPC code designs are aimed at DJSC coding of correlated sources over a GMAC, they can also be applied to channel coding of independent sources over a GMAC, similar to previous work. The decoding error probability is $10^{-3}$ for all coding rates considered here; the resulting degree profiles are listed in the table below.


**Channel coding of independent sources over a GMAC: comparison of the proposed LDPC code designs with previously reported designs (Figure three of the cited work).** $R_c$ is the code rate.

| **Code rate $R_c$** | **0.3** | **0.5** | **0.6** |
|---|---|---|---|
| $\rho(x)$ | $0.3x^{7}+0.7x^{8}$ | $0.3x^{7}+0.7x^{8}$ | $0.29x^{7}+0.71x^{8}$ |
| $\lambda(x)$ | $0.145x^{2}+0.154x^{3}+0.086x^{5}+0.055x^{11}+0.329x^{48}$ | $0.308x^{2}+0.121x^{3}+0.176x^{6}+0.021x^{22}+0.092x^{48}$ | $0.398x^{2}+0.021x^{3}+0.179x^{4}+0.032x^{40}+0.049x^{48}$ |

6 Conclusions

An approach to designing a DJSC code with symmetric rates for a pair of correlated binary sources transmitted over a GMAC, based on SI-LDPC codes, has been developed. For the EXIT analysis of the joint BP decoder for two sources, the accurate modeling of the density function of the outgoing LLRs from factor nodes in the combined factor graph of the two LDPC codes, which represent the joint source probabilities and the GMAC output conditional density (SCF nodes), has been investigated. While a tractable analytical expression appears difficult to obtain, a numerical method appropriate for EXIT analysis has been proposed for fitting a Gaussian or Gaussian mixture to model the density function of outgoing LLRs from SCF nodes. Experimental results are presented which show that SI-LDPC codes designed with this approach outperform previously reported DJSC codes. Furthermore, these results demonstrate that, for strongly dependent sources, the proposed DJSC codes can achieve code rates higher than the theoretical upper-bound for independent sources over the same GMAC.

Appendix

Since

Also, define

and similarly

Similarly, it can be shown that

Therefore

from which (9) follows.

Competing interests

The authors declare that they have no competing interests.

Acknowledgements

This work has been supported by the Natural Sciences and Engineering Research Council (NSERC) of Canada.