1.4: Representations of Integers in Different Bases
In this section, we show that every positive integer can be written in terms of any positive base integer expansion in a unique way. We normally use decimal notation to represent integers; here we show how to convert an integer from decimal notation into any other positive base notation and vice versa. Decimal notation is convenient in daily life simply because we have ten fingers, which facilitates arithmetic.
Notation. An integer \(a\) written in base \(b\) expansion is denoted by \((a)_b\).
Let \(b\) be an integer with \(b>1\). Then any positive integer \(m\) can be written uniquely as \[m=a_lb^l+a_{l-1}b^{l-1}+...+a_1b+a_0,\] where \(l\) is a nonnegative integer, \(0\leq a_j<b\) for \(j=0,1,...,l\), and \(a_l\neq 0\).
We start by dividing \(m\) by \(b\) and we get
\[m=bq_0+a_0, \ \ \ 0\leq a_0 <b.\]
If \(q_0\neq 0\) then we continue to divide \(q_0\) by \(b\) and we get
\[q_0=bq_1+a_1, \ \ \ 0\leq a_1<b.\]
We continue this process and hence we get
\[\begin{aligned} q_1&=&bq_2+a_2, \ \ \ 0\leq a_2<b,\\ &.&\\ &.&\\ &.&\\ q_{l-2}&=&bq_{l-1}+a_{l-1}, \ \ \ 0\leq a_{l-1}<b,\\ q_{l-1}&=&b\cdot 0+a_l, \ \ \ 0\leq a_l<b.\end{aligned}\]
Note that \(q_0>q_1>...\) is a strictly decreasing sequence of nonnegative integers, so the process must terminate: the last quotient \(q_l\) is \(0\).
Now substituting the equation \(q_0=bq_1+a_1\) in \(m=bq_0+a_0\), we get \[m=b(bq_1+a_1)+a_0=b^2q_1+a_1b+a_0.\] Successively substituting the remaining equations, we get \[\begin{aligned} m&=&b^3q_2+a_2b^2+a_1b+a_0,\\ &.&\\ &.&\\ &.&\\ &=&b^lq_{l-1}+a_{l-1}b^{l-1}+...+a_1b+a_0\\ &=& a_lb^l+a_{l-1}b^{l-1}+...+a_1b+a_0.\end{aligned}\] It remains to prove that the representation is unique. Suppose that \[m=a_lb^l+a_{l-1}b^{l-1}+...+a_1b+a_0=c_lb^l+c_{l-1}b^{l-1}+...+c_1b+c_0,\] where, if the two expansions have different numbers of terms, we pad the shorter one with zero coefficients so that the numbers of terms agree. Subtracting the two expansions, we get \[(a_l-c_l)b^l+(a_{l-1}-c_{l-1})b^{l-1}+...+(a_1-c_1)b+(a_0-c_0)=0.\] If the two expansions are different, let \(j\) be the smallest index with \(a_j\neq c_j\). Since \(a_i=c_i\) for all \(i<j\), the terms below index \(j\) vanish and we can factor out \(b^j\): \[b^j((a_l-c_l)b^{l-j}+...+(a_{j+1}-c_{j+1})b+(a_j-c_j))=0,\] and since \(b\neq 0\), we get \[(a_l-c_l)b^{l-j}+...+(a_{j+1}-c_{j+1})b+(a_j-c_j)=0.\] Hence \[a_j-c_j=-((a_l-c_l)b^{l-j}+...+(a_{j+1}-c_{j+1})b),\] and as a result, \(b\mid (a_j-c_j)\). Since \(0\leq a_j<b\) and \(0\leq c_j<b\), we have \(|a_j-c_j|<b\), so \(a_j=c_j\). This contradicts the choice of \(j\), and hence the expansion is unique.
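The repeated-division procedure used in the proof translates directly into a short program. The sketch below (the function name `to_base` is ours, not from the text) collects the remainders of successive divisions by \(b\):

```python
def to_base(m, b):
    """Return the base-b digits of the positive integer m, most significant first."""
    assert m > 0 and b > 1
    digits = []
    while m > 0:
        m, r = divmod(m, b)  # m = b*q + r with 0 <= r < b
        digits.append(r)
    return digits[::-1]  # remainders come out least significant first

print(to_base(214, 3))  # [2, 1, 2, 2, 1], i.e. (21221)_3
```

Because each quotient is strictly smaller than the last, the loop terminates, exactly as the proof argues.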
Note that the base 2 representation of integers is called binary representation. Binary representation plays a crucial role in computers. Arithmetic operations can be carried out on integers in any positive integer base, but this will not be addressed in this book. We now present examples of how to convert from decimal representation to any other base representation and vice versa.
To find the expansion of 214 base 3:
we do the following \[\begin{aligned} 214&=&3\cdot 71+1\\ 71&=& 3\cdot 23+2\\ 23&=& 3\cdot 7+2\\ 7&=& 3\cdot 2+1\\ 2&=& 3\cdot 0+2\\\end{aligned}\] As a result, to obtain the base 3 expansion of 214, we read the remainders of the divisions from the last to the first, and we get \((214)_{10}=(21221)_3\).
To find the base 10 expansion, i.e. the decimal expansion, of \((364)_7\):
We do the following: \(4\cdot 7^0+6\cdot 7^1+3\cdot 7^2=4+42+147=193\).
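In the other direction, a digit list is simply evaluated as a polynomial in \(b\); Horner's rule does this with one multiplication per digit. A minimal sketch (the name `from_base` is our own):

```python
def from_base(digits, b):
    """Evaluate a list of base-b digits (most significant first) as an integer."""
    m = 0
    for d in digits:
        assert 0 <= d < b
        m = m * b + d  # Horner's rule: ((a_l*b + a_{l-1})*b + ...) + a_0
    return m

print(from_base([3, 6, 4], 7))  # 193, matching the worked example
```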
In cases where a base \(b>10\) expansion is needed, we introduce additional symbols to represent digits greater than 9. It is customary to use the letters of the alphabet to denote digits greater than 9 in base \(b\) expansions for \(b>10\). For example, in \((46BC29)_{13}\) we have \(A=10, B=11, C=12\).
To convert from one base to another, the simplest way is to go through base 10 and then convert to the target base. There are methods that simplify direct conversion from one base to another, but they will not be addressed in this book.
Exercises
- Convert \((7482)_{10}\) to base 6 notation.
- Convert \((98156)_{10}\) to base 8 notation.
- Convert \((101011101)_2\) to decimal notation.
- Convert \((AB6C7D)_{16}\) to decimal notation.
- Convert \((9A0B)_{16}\) to binary notation.
Contributors and Attributions
- Dr. Wissam Raji, Ph.D., of the American University in Beirut. His work was selected by the Saylor Foundation’s Open Textbook Challenge for public release under a Creative Commons Attribution (CC BY) license.