1. A 16-bit analog-to-digital converter has an input range of ±12 V. Compute the resolution error of the converter for the analog input. If an 8-bit converter were used, how would the resolution error change?
2. The input voltage range of an 8-bit single-slope integrating analog-to-digital converter is ±12 V. Find the digital output for an analog input of 5 V. Express it in decimal and binary formats.
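Both problems rest on the same two relations: the resolution (1 LSB) of an n-bit converter over a full-scale range FSR = Vmax - Vmin is FSR / 2^n, and an ideal converter maps an input to the nearest output code. The sketch below applies these relations under the stated assumptions (offset-binary coding over the bipolar range, code = round((Vin - Vmin) / FSR * (2^n - 1))); the function names are illustrative, not taken from the exercises, and the printed values are only what those assumptions give, not the textbook's official answers.

# Minimal sketch of the ADC relations used in these exercises.
# Assumptions: resolution = FSR / 2**n, offset-binary coding,
# ideal (error-free) conversion; names are hypothetical.

def resolution(v_min: float, v_max: float, n_bits: int) -> float:
    """Voltage value of one LSB for an n-bit converter over [v_min, v_max]."""
    return (v_max - v_min) / (2 ** n_bits)

def digital_code(v_in: float, v_min: float, v_max: float, n_bits: int) -> int:
    """Offset-binary output code for an input voltage, assuming an ideal converter."""
    fsr = v_max - v_min
    return round((v_in - v_min) / fsr * (2 ** n_bits - 1))

if __name__ == "__main__":
    # Problem 1: +/-12 V range, 16-bit vs. 8-bit resolution
    print(resolution(-12, 12, 16))          # 24 V / 65536 ~ 0.366 mV per LSB
    print(resolution(-12, 12, 8))           # 24 V / 256   = 93.75 mV per LSB
    # Problem 2: 5 V input to the 8-bit +/-12 V converter
    code = digital_code(5.0, -12, 12, 8)
    print(code, format(code, "08b"))        # decimal and binary forms of the code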