Thank you very much for your help.
I'm using MKM34Z256VLL7 and I'm having trouble with the SAR-ADC initialization.
The source is shown below.
In the code below, clockSource is set to kADC16_ClockSourceAlt2 (bus clock / 2 = 20.97152 MHz / 2 = 10.48576 MHz), and clockDivider is left at the default kADC16_ClockDivider8.
With this configuration, ADC16_DoAutoCalibration() never sees kADC16_ChannelConversionDoneFlag become set, so it spins in an infinite loop.
Could you tell me why clockSource = kADC16_ClockSourceAlt2 does not work?
I read in the datasheet that the ADC conversion clock should be less than 18 MHz, so this setting looks like it should be within the limit.
---
void adcInitialize(void)
{
    adc16_config_t config;

    /*
     * Defaults set by ADC16_GetDefaultConfig():
     * config->referenceVoltageSource = kADC16_ReferenceVoltageSourceVref;
     * config->clockSource = kADC16_ClockSourceAsynchronousClock;
     * config->enableAsynchronousClock = true;
     * config->clockDivider = kADC16_ClockDivider8;
     * config->resolution = kADC16_ResolutionSE12Bit;
     * config->longSampleMode = kADC16_LongSampleDisabled;
     * config->enableHighSpeed = false;
     * config->enableLowPower = false;
     * config->enableContinuousConversion = false;
     * config->hardwareAverageMode = kADC16_HardwareAverageDisabled;
     */
    ADC16_GetDefaultConfig(&config);
    config.referenceVoltageSource = kADC16_ReferenceVoltageSourceValt;
    config.clockSource = kADC16_ClockSourceAlt2; /* bus clock / 2 = 20.97152 MHz / 2 = 10.48576 MHz */
    config.longSampleMode = kADC16_LongSampleCycle24;
    config.hardwareAverageMode = kADC16_HardwareAverageCount32;
    ADC16_Init(ADC0, &config);
    ADC16_EnableHardwareTrigger(ADC0, false); /* software trigger */
    ADC16_SetHardwareCompareConfig(ADC0, NULL);
    if ((status_t)kStatus_Success == ADC16_DoAutoCalibration(ADC0)) {
(The rest of the function is omitted.)

Thank you.
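For context on what would follow a successful calibration, here is a hedged sketch of a single blocking, software-triggered read using the same SDK driver. The channel number and channel group are placeholders I chose for illustration, not values from the post; this fragment is board-dependent and only compiles against the MCUXpresso fsl_adc16 driver:

```c
/* Hypothetical follow-up: one blocking software-triggered read on ADC0.
 * Channel 0 / group 0 are placeholders; adapt to the actual wiring. */
static uint32_t adcReadBlocking(void)
{
    adc16_channel_config_t channelConfig = {
        .channelNumber = 0U, /* placeholder channel */
        .enableInterruptOnConversionCompleted = false,
    };

    /* In software-trigger mode, writing the channel config starts a conversion. */
    ADC16_SetChannelConfig(ADC0, 0U, &channelConfig);

    /* Poll COCO -- the same flag the calibration routine waits on. */
    while (0U == (kADC16_ChannelConversionDoneFlag &
                  ADC16_GetChannelStatusFlags(ADC0, 0U)))
    {
    }

    return ADC16_GetChannelConversionValue(ADC0, 0U);
}
```

If the calibration hang is clock-related, this polling loop would presumably hang in the same way, which may help narrow the problem down to the clock configuration rather than the calibration routine itself.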