I've got an STM32 with a straightforward voltage divider bringing a battery level line into an ADC. The problem is that I'm getting a value that doesn't make a ton of sense to me. According to my scope, B_LEV (the divided line, going to GPIOC pin 1 / ADC1 channel 11) sits at 2.49 V, with a VREF of 3.3 V. The value I'm reading is 2148 (12-bit ADC), which translates to 2148 / 4096 * 3.3 ≈ 1.73 V, which is obviously not what the scope shows.
Am I screwing up the math, or my ADC setup?
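Spelling the math out, in case I'm fooling myself (this little helper is just for illustration, assuming a 12-bit right-aligned result and VREF = 3.3 V):

// Illustration only: convert a raw 12-bit ADC count to volts
static float AdcCountsToVolts(int raw) {
    return (raw / 4096.0f) * 3.3f;
}
// AdcCountsToVolts(2148) comes out to ~1.73 V, but the scope shows 2.49 V.
// Going the other way, 2.49 V should read back as roughly
// 2.49 / 3.3 * 4096 ~= 3090 counts, nowhere near 2148.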
Here's the initialization and reading code:
void InitADC() {
    ADC_InitTypeDef ADC_InitStructure;
    GPIO_InitTypeDef GPIO_InitStructure;

    // ADC clock = PCLK2 / 4 (keeps it under the 14 MHz ADC limit)
    RCC_ADCCLKConfig(RCC_PCLK2_Div4);
    // The GPIOC clock has to be on too, or the pin config is silently ignored
    RCC_APB2PeriphClockCmd(RCC_APB2Periph_GPIOC | RCC_APB2Periph_ADC1, ENABLE);

    // PC1 as analog input (ADC1 channel 11)
    GPIO_InitStructure.GPIO_Pin = GPIO_Pin_1;
    GPIO_InitStructure.GPIO_Mode = GPIO_Mode_AIN;
    GPIO_Init(GPIOC, &GPIO_InitStructure);

    // Single channel, free-running continuous conversion, right-aligned data
    ADC_InitStructure.ADC_Mode = ADC_Mode_Independent;
    ADC_InitStructure.ADC_ScanConvMode = DISABLE;
    ADC_InitStructure.ADC_ContinuousConvMode = ENABLE;
    ADC_InitStructure.ADC_ExternalTrigConv = ADC_ExternalTrigConv_None;
    ADC_InitStructure.ADC_DataAlign = ADC_DataAlign_Right;
    ADC_InitStructure.ADC_NbrOfChannel = 1;
    ADC_Init(ADC1, &ADC_InitStructure);

    // Channel 11, rank 1, longest sample time for the high-impedance divider
    ADC_RegularChannelConfig(ADC1, ADC_Channel_11, 1, ADC_SampleTime_239Cycles5);

    ADC_Cmd(ADC1, ENABLE);

    // Run the built-in calibration before the first conversion
    ADC_ResetCalibration(ADC1);
    while (ADC_GetResetCalibrationStatus(ADC1));
    ADC_StartCalibration(ADC1);
    while (ADC_GetCalibrationStatus(ADC1));

    // Kick off continuous conversions
    ADC_SoftwareStartConvCmd(ADC1, ENABLE);
}
int ReadBatteryValue() {
    // No fresh sample yet: return -1 (outside the 0..4095 range) so the
    // caller can tell it apart from a real reading
    if (ADC_GetFlagStatus(ADC1, ADC_FLAG_EOC) == RESET)
        return -1;
    // Reset the flag
    ADC_ClearFlag(ADC1, ADC_FLAG_EOC);
    // Get the conversion value
    return ADC_GetConversionValue(ADC1);
}
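In the main loop it gets used roughly like this - a hypothetical sketch rather than the actual firmware, just to show the scaling back to millivolts:

// Hypothetical main-loop usage - assumes VREF = 3.3 V and a 12-bit result
while (1) {
    int raw = ReadBatteryValue();
    if (raw >= 0) {
        // Scale 0..4095 counts to 0..3300 mV without floating point
        int millivolts = raw * 3300 / 4096;
        // ... report/log millivolts here ...
    }
}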
Answer
So - after digging around some more, I realized the problem is far simpler: I'm writing code for a third-party module based on an STM32 chip, and the vendor remapped the GPIOs. I was reading the wrong ADC channel (sigh).
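For anyone else on a module like this, the fix is just pointing the regular-channel config at whichever channel the module actually routes the pin to. Something like the line below, where ADC_Channel_14 is only a placeholder - check the module's schematic for the real mapping:

// Placeholder fix: select the channel the module actually wires B_LEV to
// (ADC_Channel_14 here is hypothetical - confirm against the schematic)
ADC_RegularChannelConfig(ADC1, ADC_Channel_14, 1, ADC_SampleTime_239Cycles5);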