Hi there,
I am using a 13192-EVB board from Freescale with an MC9S08GT60 MCU, and I am trying to get a DS1822 temperature sensor to communicate with it. I keep getting a temperature value of -255 every time, and I was wondering if someone could help me troubleshoot. I am using a 4.7k pullup resistor and providing approximately 5 V to power the DS1822, with a wire from pin 2 of the DS1822 to one of the IO pins (J107) on the 13192-EVB board.

I have pasted the code below. I am not sure I have the bus speed configured correctly: using the Device Initialization Expert, I enabled the FLL clock to set the internal bus frequency to approximately 8 MHz (7.776 MHz is the closest I could get), which I think is what the delay function I am using requires, but I could be wrong. Also in the Device Initialization Expert, I enabled PTA0 and PTA1 for input and output respectively, with pullups enabled.

The problem is that when I compile and run the code, uiTemperature always comes back as -255. It seems to me that the DS1822 is simply not communicating properly with the MCU and that the timing (delays) must be wrong. I would really appreciate your thoughts on this. Thanks.
Here are the definitions:
#define TRUE 1
#define FALSE 0
#ifndef PORTS_DQ
#define PORTS_DQ 1
#define DQ_RX PTAD_PTAD0
#define DQ_TX PTAD_PTAD1
#define DIR_DQRX 0
#define DIR_DQTX 1
#endif
Hello,
MMG135 wrote: Since I am just using Self-Clocked Mode right now to obtain the 4 MHz bus frequency. However, when running the code in CodeWarrior I see that the actual frequency is around 3.7 MHz on the 13192-SARD boards and around 4.2 MHz on the 13192-EVB boards. So I'm guessing I would have to use the trim register to get this as close to 4 MHz as possible, thereby ensuring that the delays are accurate. I'm assuming the reason I'm not getting 0x0550 (85 C) returned is slight timing issues. Would you be able to give me a rough idea of how to use the trim register? I understand from the manual that if my bus frequency is lower than expected (i.e. lower than 4 MHz), I would have to make the binary value of ICGTRM lower, and if it is higher than 4 MHz, I must make the value in ICGTRM higher. Would I determine the ICGTRM value by trial and error, or is there a certain formula? I'm assuming that once I find the value I would just assign it in my main and that would take care of it, for example ICGTRM = 0x18;. Am I right in assuming that I could use the ICGTRM register for either the 8 MHz clock or the 243 kHz clock? Of course, in this case I'm using the 8 MHz clock, which is divided by 2 to give me the 4 MHz bus frequency. Thanks for your help.
The use of SCM mode is quite inappropriate for the timing accuracy that you require. I had assumed that you would be using FEI mode, which requires trimming of the 243 kHz reference. The trim calibration can be determined at the time of programming.
If you use an external crystal, or oscillator module ( I am not sure whether Zigbee usage mandates this accuracy), you would need to select either FEE or FBE mode. For a given crystal frequency, higher bus frequencies will be possible using FEE mode. The trim registers are not relevant to an external reference source.
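If you do stay with FEI mode, applying a programming-time trim might look like this (a sketch only; ICG_ApplyTrim and NV_ICGTRM are illustrative names, and the 0xFFBE nonvolatile location is merely the conventional spot for storing the trim on these devices, so verify it against the GT60 datasheet):

// Sketch: copy a programmer-stored trim value from flash to ICGTRM.
// The 0xFFBE location is an assumption; check the GT60 datasheet.
#define NV_ICGTRM (*(volatile unsigned char *)0xFFBEU)

void ICG_ApplyTrim(void)
{
    if (NV_ICGTRM != 0xFF)      // has the location been programmed?
        ICGTRM = NV_ICGTRM;     // trim the 243 kHz internal reference
}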
Regards,
Mac
I thought I should add a few things I just found out. I modified my delay function so that it gives a 2 us delay, since I am running the internal bus clock at 4 MHz, and I adjusted all the functions to account for the 2 us delay instead of a 1 us delay. All the functions have been rewritten according to the examples on the Maxim website for the DS1822, except that the timing has been modified to suit my delay function and internal bus clock.

I wanted to know if there is a trace option in the simulator, since I think that would be handy to see whether the value of uiTemperature changes or always stays at -255. I compiled the code without any DS1822 sensor connected, and the uiTemperature value still came back as -255! So either I'm not tracing the uiTemperature value properly or something is wrong somewhere else.

I have attached the modified code as well; hopefully it's a bit easier to read than the last one. Any comments would be greatly appreciated since I am new to Freescale products. I am probably missing something very fundamental, but I just can't figure out what it is. Thanks for the help.
Hello,
Your use of the delay_2us() function, particularly within read_bit(), makes use of quite low delay values, but seems not to have taken into account the overhead cycles required to place the parameter into the H:X register and call the function, and to eventually exit the function. I am guessing that this overhead will be at least 19 cycles, or 4.8 us with a 4MHz bus frequency. The overhead would need to be compensated within the parameter value, and there will be a minimum allowable delay of 6.8us (19 + 8 cycles).
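As a rough worked example (using the 19-cycle guess above, and assuming each loop pass of delay_2us() really does take 2 us):

// Illustration only: actual delay = call overhead + loop passes.
// 19 cycles @ 4 MHz = ~4.75 us of overhead, so for a ~16 us slot
// request 6 passes (4.75 + 6 * 2 = ~16.75 us), not 8.
delay_2us(6);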
Perhaps you should confirm the actual delay overhead using full chip simulation, which provides a cycle counter. If you need delays of less than the minimum delay, these would need to be generated by other means.
Note that during the critical bit timing periods, interrupts will need to be globally disabled.
Regards,
Mac
Hello,
For very short delays, the overheads previously discussed may be much reduced by using a suitable macro, rather than a function. Here is a very simple one, capable of generating delays of 2-256 us, in 1 us steps, for a 4 MHz bus. If an 8 MHz bus frequency were chosen, the delay range would reduce to 1-128 us.
#define FBUS  4000000
#define FBUSK FBUS / 1000

// Allowable delay range: 2-256 us for 4 MHz bus
//                        1-128 us for 8 MHz bus
#pragma NO_STRING_CONSTR
#define DELAYUS(x) __asm ldx #((x) * FBUSK / 4000 - 1); \
                   __asm nop; __asm nop; \
                   __asm lp##i: dbnzx lp##i
For example, to generate a 10us delay, the macro would be called as follows:
DELAYUS(10);
and the following assembly code would be substituted:
ldx #9 ;[2]
nop ;[1]
nop ;[1]
dbnzx *+0 ;[4]
for a cycle count of 4 + 9 * 4 = 40 cycles
Regards,
Mac
Regarding my last post: I suppose I could use a macro for smaller time delays (under 128 us) and the asm function for longer time delays (greater than 128 us)? Would that be the best way to approach it, as opposed to sequential delays using the macro? Also, you mentioned that I should globally disable interrupts while performing the time-sensitive read/write functions. What would be the best way to do this for the MC9S08GT60 MCU? I suppose the most time-sensitive functions would be read_bit, write_bit and reset, and possibly read_byte and write_byte? Would I have to disable interrupts at the beginning of these functions before they execute, or what would be the best way to do this? Thanks for your help, I really appreciate it.
Here is the delay generated in asm; I am generating an 8 MHz bus speed to use with it. But as you mentioned, due to the overhead cycles this still will not give me exactly a 1 us delay.
void delay_1us(unsigned int n)
{
  asm {
         LDHX (n)       ; load the delay count into H:X
  LOOP1: AIX  #-1       ;[2]
         CPHX #0        ;[3]
         BNE  LOOP1     ;[3] 8 cycles per loop = 1 us at 8 MHz
  }
}
Here are the time-sensitive read/write functions:
char read_bit(void)
{
  char i;
  DQ_TX = 0;          // drive the bus low to start the read slot
  delay_1us(1);
  DQ_TX = 1;          // release the bus
  delay_1us(16);      // wait 16 us before sampling
  i = DQ_RX;          // sample the bus state
  delay_1us(44);      // wait out the remainder of the slot
  return i;
}
void write_bit(char bitval)
{
  DQ_TX = 0;          // drive the bus low to start the write slot
  if (bitval == 1)
    DQ_TX = 1;        // release early to write a 1
  delay_1us(104);     // hold for the duration of the slot
  DQ_TX = 1;          // release the bus
}
unsigned char read_byte(void)
{
  unsigned char i;
  unsigned char value = 0;
  for (i = 0; i < 8; i++) {     // LSB first
    if (read_bit())
      value |= 0x01 << i;
    delay_1us(120);             // recovery time between slots
  }
  return value;
}
void write_byte(char val)
{
  unsigned char i;
  unsigned char temp;
  for (i = 0; i < 8; i++) {     // LSB first
    temp = val >> i;
    temp &= 0x01;
    write_bit(temp);
  }
  delay_1us(104);
}
Another question I had: for the conversion command (0x44) on the DS1822, the datasheet states that a 12-bit conversion takes 750 ms! So I would not be able to use any of these microsecond delays, or else I would require 750,000 of them. How would you recommend creating a millisecond delay for this purpose (DELAY1MS, for example)? Thanks for all your help.
If you want an alternative to the inline delay method, which stalls all other processing, you could use the RTC.
void InitRTCInternalClock(void)
{
    // Set RTIE to enable interrupts, select the 1KHz internal oscillator,
    // set the divider to 1 for a 1ms interrupt.
    RTCSC = 0x18;
    RTCMOD = 0;
}

long ms_counter;

interrupt VectorNumber_Vrtc void RTC_InterruptHandler(void)
{
    ++ms_counter;
    RTCSC |= 0x80;   // Ack the interrupt
}
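A blocking wait built on that counter might look like this (a sketch only; delay_ms is an illustrative name, ms_counter should be declared volatile since the ISR writes it, and on an 8-bit CPU the multi-byte read is not atomic, so production code would briefly mask interrupts around it):

// Illustrative sketch: busy-wait for a number of milliseconds.
void delay_ms(long ms)
{
    long start = ms_counter;        // assumes ms_counter is volatile
    while ((ms_counter - start) < ms)
        __RESET_WATCHDOG();         // e.g. ride out the 750 ms conversion
}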
Hi Jim,
Thanks for that post. Could you give me an example of how to use these functions? Would I have to call the function to initialize the clock, and then call another function to specify how long I want to delay for? I have learned that timing is very critical for these 1-wire devices, so perhaps a delay using a clock would be preferred, since I presume there would be less overhead, if any, than with the inline delays? Thanks for the help.
Hello,
MMG135 wrote: Thanks for the reply. Would I be able to use that delay you mentioned sequentially? For example, if I am using an 8 MHz bus and I want a 470 us delay, can I write the following statements, or will there be a significant overhead delay if I run delays sequentially like this:
DELAYUS(128);
DELAYUS(128);
DELAYUS(128);
DELAYUS(86);
The macro may be cascaded, as you have shown; however, keep in mind that, for each macro usage, six bytes of code are substituted. This may not be the most efficient method.
The macro itself already takes into account the setup overhead, and actually this is padded out so that it exactly equals the cycles per loop (4 cycles), and is the reason for subtracting 1 within the delay setup expression.
MMG135 wrote: Regarding my last post: I suppose I could use a macro for smaller time delays (under 128 us) and the asm function for longer time delays (greater than 128 us)? Would that be the best way to approach it, as opposed to sequential delays using the macro? Also, you mentioned that I should globally disable interrupts while performing the time-sensitive read/write functions. What would be the best way to do this for the MC9S08GT60 MCU? I suppose the most time-sensitive functions would be read_bit, write_bit and reset, and possibly read_byte and write_byte? Would I have to disable interrupts at the beginning of these functions before they execute, or what would be the best way to do this? Thanks for your help, I really appreciate it.
If an interrupt should occur during the execution of a time critical function, any delay will be extended by the execution period of the associated ISR. The bit read and write functions are time critical in that the delays need to be precisely controlled. Other delay operations for which there is a minimum delay, but no specified maximum, can usually tolerate the presence of interrupts.
Macros for DisableInterrupts; and EnableInterrupts; are located in the header file <hidef.h>, or you can do your own thing:
#define Disable_interrupts __asm sei
#define Enable_interrupts __asm cli
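Typical usage is simply to bracket each critical slot, for example:

DisableInterrupts;   // mask interrupts during the precise bit timing
write_bit(temp);     // time-critical slot
EnableInterrupts;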
For longer, less critical time delays, you might use your previous function, or you could make use of a TPM module. Another possibility is to use the delay macro within a loop.
#define M 97   // Corrected loop constant for 100us delay

void delay100us( word n)
{
  for ( ; n; n--) {
    __RESET_WATCHDOG();
    DELAYUS(M);
  }
}
You would need to experimentally determine the corrected loop constant to produce exactly 100us per loop.
To achieve a 750ms minimum delay you might then call delay100us( 7500); It would not be necessary to disable interrupts in this instance.
I have just noticed that you are using two MCU pins to interface with the one-wire device, one pin as an output and another as an input. This implies that you would use an open-collector (or drain) transistor between the MCU pin and the output connection, or alternatively would set the output pin as an input, when high-Z is required during read. If you are using the latter method, you may as well use a single pin for both write and read processes.
With either method, you will need to take into account the cycles required to setup the pin(s), and compensate within your critical delay constants.
I have also attached a couple of files. One provides some examples of various methods of generating time delays of different durations. The other provides a relocatable (real) assembly code example of bit banging a one-wire interface. However, that code was for an earlier HC908 device family, for which some of the instructions have different cycle counts compared with the HCS08 family.
Regards,
Mac
Another thing that still puzzles me: even if my timing is way off, I thought I should still be getting at least the default value (85 degrees C) returned. Since I'm mostly getting 255 for both the LSB and MSB, there must have been a problem with my IO pin assignment, as you pointed out, so I think that is the first thing I have to straighten out. After I get that fixed, I should at least be getting the DS1822 to return its default value (85 C), or 0x0550. Am I right in assuming this? Then, after I get the default value, I can always adjust the conversion time if necessary, as well as the individual timings for the read and write bit functions.
Hello,
If you are sampling each returned data bit at the wrong point in time, a result of 255 (or binary %11111111) is quite possible. I assume that the "default" value would be the value returned prior to any conversion taking place. With incorrect timing, the return of this value would be just as problematic as the return of a real temperature value.
You will not achieve meaningful results until the timing problems are fixed, in addition to providing the correct one-wire drive conditions.
MMG135 wrote: ... Would something like this be correct for using a single GPIO pin to interface with the 1-wire device? Also, if I have an external pullup resistor, would I need a definition at the bottom for a pullup? Am I correct to assume that if I use ONE_WIRE_SET_LOW it will drive the bus low, and ONE_WIRE_SET_HIGH will set the GPIO pin as an input so that the external 4.7k pullup resistor pulls the bus high?
#define ONE_WIRE_PIN PTCD_PTCD4
#define ONE_WIRE_DIR PTCDD_PTCDD4
//1-Wire pin management
#define ONE_WIRE_SET_LOW ONE_WIRE_DIR = 1; ONE_WIRE_PIN = 0; //output
#define ONE_WIRE_SET_HIGH ONE_WIRE_PIN = 0; ONE_WIRE_DIR = 0; //input
#define ONE_WIRE_SET_STRONG_PU ONE_WIRE_PIN = 1; ONE_WIRE_DIR = 1;
For the high-Z read case, there is no necessity to set the output register to low, since you have input mode. The following would suffice, which will also slightly reduce the overhead cycles:
#define ONE_WIRE_SET_HIGH ONE_WIRE_DIR = 0; // High-Z input
Note that, while the CW compiler seems to produce the most efficient code for the above operations, this may not necessarily be so for all compilers. This could result in differences for the number of overhead cycles. For the time critical bit banging, there is a more explicit alternative that might be considered.
#define OW_DPIN 4
#define OW_DPORT PTCD
#define OW_DPORTDD PTCDD
#define OW_SETLOW __asm bset OW_DPIN,OW_DPORTDD; __asm bclr OW_DPIN,OW_DPORT // 10 cycles
#define OW_SETHIGH __asm bclr OW_DPIN,OW_DPORTDD // 5 cycles
Yes, you will need an external pullup resistor - the internal pullup will not be sufficient to allow for capacitive loading of the one-wire bus.
From your original description, my understanding was that you were applying Vdd to the temperature sensor, and were not using parasitic power configuration. Is this correct? Your defines seem to suggest that this may not be the case.
Regards,
Mac
Hi Mac,
Thanks so much for that post. I will try those defines that you mentioned. It is true that I was applying 5 V to the VDD pin of the DS1822 one-wire device. I have a 4.7k pullup resistor from 5 V to the DQ (data) pin, and the last pin of the DS1822 is grounded. I am running a wire from the DQ pin on the DS1822 to one of the 10 GPIO pins (J107) on the 13192-EVB board. I am trying to use the DS1822 in externally powered mode. So perhaps some aspects of the code I am using were written for parasitic mode? If so, what would I change to ensure that I am not trying to use the DS1822 in parasitic mode? Thanks for the help.
Parasitic power mode would require a "strong pullup" during the conversion period. Normal mode does not require use of the strong pullup condition.
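For contrast, a parasitic-power sequence would bridge the conversion something like this (a sketch only, reusing the defines from your earlier post; not needed in your externally powered setup):

// Parasitic power only: actively drive the bus high during conversion.
ONE_WIRE_SET_STRONG_PU      // push-pull high
delay100us(7500);           // hold through the ~750 ms conversion
ONE_WIRE_SET_HIGH           // release to high-Z; external pullup resumes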
Regards,
Mac
Hi Mac,
There is a problem I keep running into. When I try to switch the pin to an input for high-Z mode, I never seem to get any response from the 1-wire device. The only way I can get even a 255 is by using the following definitions; I don't know why simply switching the pin between output and input gives me absolutely no response, just all zeros.

This is the configuration that gives me 255. I have posted the reset function below; the read_bit and write_bit functions use a similar idea. I configure the pins as outputs initially to drive the bus high or low, and then when I want to read I force the pin to high-Z (ONE_WIRE_DIR = 0), and then I actually get a value (255). Although, as you mentioned, this still means my timing is off. Will this way of doing it work as well, i.e. for ONE_WIRE_SET_HIGH, setting the pin as an output and then driving the pin high?
//#define ONE_WIRE_PIN PTBD_PTBD1
//#define ONE_WIRE_DIR PTBDD_PTBDD1
//1-Wire pin management
//#define ONE_WIRE_SET_LOW ONE_WIRE_PIN = 0; ONE_WIRE_DIR = 1;
//#define ONE_WIRE_SET_HIGH ONE_WIRE_PIN = 1; ONE_WIRE_DIR = 1;
char ow_reset(void)
{
  unsigned char presence;
  ONE_WIRE_SET_LOW
  ONE_WIRE_DELAY(480);    // hold the bus low for 480 us (reset pulse)
  ONE_WIRE_SET_HIGH
  DELAYUS(70);            // wait for the presence pulse
  ONE_WIRE_DIR = 0;       // set to input to read the pin value
  presence = ONE_WIRE_PIN;
  ONE_WIRE_DELAY(420);    // wait out the rest of the presence window (70 + 420 = 490 us)
  return presence;
}
Note: if I use:
#define ONE_WIRE_SET_LOW ONE_WIRE_PIN = 0; ONE_WIRE_DIR = 1;
#define ONE_WIRE_SET_HIGH ONE_WIRE_DIR = 0;
and then use the following ow_reset() function without the line ONE_WIRE_DIR = 0; to change to the input direction, I get no response from the DS1822 (all zeros).
char ow_reset(void)
{
  unsigned char presence;
  ONE_WIRE_SET_LOW
  ONE_WIRE_DELAY(480);    // hold the bus low for 480 us (reset pulse)
  ONE_WIRE_SET_HIGH
  DELAYUS(70);            // wait for the presence pulse
  presence = ONE_WIRE_PIN;
  ONE_WIRE_DELAY(420);    // wait out the rest of the presence window
  return presence;
}
Here is the function for read_bit using the same idea:
char read_bit(void)
{
  char i;
  ONE_WIRE_SET_LOW
  DELAYUS(1);
  ONE_WIRE_SET_HIGH
  ONE_WIRE_DELAY(10);
  ONE_WIRE_DIR = 0;       // force the pin to input before sampling
  i = ONE_WIRE_PIN;
  return i;
}
Of course, I could be doing something fundamentally wrong again. Is it possible that the 13192-EVB development board IO pins function differently than the MCU by itself? Or should the J107 pins on the development board behave just as if I were connecting directly to the MCU's IO pins? This development board is a Zigbee-capable board with a built-in transmitter/receiver; I don't know whether that makes any difference.

I did use one of the demo applications, which included some of the Zigbee functionality, but I commented out all references to it in main() except for the serial port code, which I am using to output my temperature readings to the serial port and read them in HyperTerminal. Is there any chance that the IO pins I am using are still being used by the Zigbee application when the MCU is initialized, even though I have commented out everything except the serial code in main()? I will paste my code to show you what I mean; it compiles just fine.

The only reason I used this template is that I eventually plan to send the temperature information from the DS1822 wirelessly via Zigbee to another board, and I thought it would be handy to have the header files etc. already in the project. Also, it already had serial code I can use to output the temperatures to HyperTerminal and check what kind of temperature values I'm getting.
Hello,
Have you actually observed the waveform on the one-wire bus with an oscilloscope, to see whether the timing is reasonable? This may also reveal any pin conflict issues. I think I can see a potential problem with your use of the DELAYUS() macro within the following code.
char read_bit(void)
{
  char i;
  DisableInterrupts;
  ONE_WIRE_SET_LOW
  DELAYUS(1);
  ONE_WIRE_SET_HIGH
  ONE_WIRE_DELAY(10);
  ONE_WIRE_DIR = 0;   // This would appear unnecessary
  i = ONE_WIRE_PIN;
  EnableInterrupts;
  // You will need an additional delay here, say 48us, to reach the end of the bit period
  return i;
}
The problem is with DELAYUS(1): with a 4 MHz bus, the load value evaluates to (1 * 4000 / 4000 - 1) = 0. This means that the X value will commence to decrement from zero, wrapping around, and giving a pulse width of more than 1024 cycles. I wonder if this is the problem?
In this instance, it would appear better to pad out the ONE_WIRE_SET_LOW macro with additional NOPs, so that the output will remain low for a minimum of 1us for the highest bus frequency, to meet the requirements of the device.
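For example, a padded variant might be (a sketch; OW_SETLOW_1US is an illustrative name, and the three NOPs assume the 5-cycle BCLR of the OW_SETHIGH release macro and an 8 MHz worst case, so that 3 + 5 = 8 cycles = 1 us elapse before the pin is released):

// Keep the pin low for at least 1 us at 8 MHz without using DELAYUS:
#define OW_SETLOW_1US OW_SETLOW; __asm nop; __asm nop; __asm nop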
I subsequently found a problem with the DELAYUS macro. The pragma is present to allow the use of immediate addressing within the inline code, but this is incompatible with the label format that allows multiple use of the macro within a single function. This consequently produced compile errors, which I presume you discovered. I solved this problem by calling a function/subroutine within the macro. This does increase the initial overhead, but this is now acceptable with a minimum delay requirement of 10us (with padding for the 1us requirement).
Further problems with your read_bit() code are that you are not globally disabling interrupts during the critical timing, and you are not waiting until the end of the bit period before exiting the function. The total bit period has a minimum of 60 microseconds, and you will be potentially attempting to read the next bit within this period.
You will also need to ascertain whether the global disabling of interrupts for the periods necessary for one-wire operation, will give problems for Zigbee operation.
Have you noticed that reading a bit is very similar to writing a one? It is therefore possible to use a single function for both read and write operations. Using this concept, to read a byte value, you would write a dummy byte value of 0xFF.
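In C terms, the combined operation might look something like this (a sketch only, not the attached code; OW_bitproc is an illustrative name, the delay values assume the datasheet slot timing, it mixes your ONE_WIRE_PIN define with my OW_ macros, and multiple DELAYUS() uses per function assume the repaired macro mentioned above):

// Sketch: one function serves both read and write. A read slot is
// identical to writing a 1, except that the bus is sampled.
static unsigned char OW_bitproc(unsigned char bitval)
{
  unsigned char sample = 0;

  DisableInterrupts;
  OW_SETLOW;                  // start the time slot
  if (bitval) {
    DELAYUS(6);               // short low pulse for write-1 / read
    OW_SETHIGH;               // release; the pullup raises the bus
    DELAYUS(7);               // sample within ~15 us of the slot start
    sample = ONE_WIRE_PIN;    // 1 if the slave has left the bus high
    DELAYUS(45);              // pad out the 60 us slot
  } else {
    DELAYUS(60);              // write-0: hold low for the full slot
    OW_SETHIGH;
  }
  EnableInterrupts;
  DELAYUS(2);                 // recovery time between slots
  return sample;
}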
I have attached some code that uses this approach. For code simplicity, I am also using the carry flag (CF) to communicate with the function. This is possible, in this instance, because of the extensive use of inline assembly, which ensures that the compilation process does not corrupt the flag state.
The following code snippet shows the usage of the one-wire functions, based on my interpretation of the datasheet for the DS1822. I have not actually tested a device.
#include <hidef.h>        /* EnableInterrupts macro */
#include "derivative.h"   /* peripheral declarations */
#include "One-wire.h"
#include "DS1822.h"

// Global variables:
SCRATCHPAD spad;

......

  // Start temperature conversion:
  if (OW_reset()) {
    (void)OW_byteproc( SKIP_ROM);
    (void)OW_byteproc( CONVERT_T);

    // Wait for conversion complete
    while (OW_byteproc( DUMMY) == 0x00)
      __RESET_WATCHDOG();
  }

  // Get results from scratchpad:
  if (OW_reset()) {
    (void)OW_byteproc( SKIP_ROM);
    (void)OW_byteproc( READ_SPAD);

    // Read scratchpad data
    for (i = 0; i < 9; i++) {
      spad.array[i] = OW_byteproc( DUMMY);
    }
  }
Note that the RTC module is not suited to timing intervals of tens of microseconds.
Regards,
Mac
I just thought of another reason it may not be working properly. I thought I could drive the bus low by setting the GPIO pin as an output and outputting a zero, and drive the bus high by setting the GPIO pin as an input (high-Z). However, I have read that this may not work unless the GPIO is configured as open-drain or open-collector. I am not sure whether the GPIO pins on the 13192-EVB board are open-drain. Is there a way I can find this out? If they are not open-drain, I am guessing this means I have to use your second approach, with a transistor between the MCU and the output of the DS1822. Let me know if I am correct in assuming this. Perhaps this is another possible reason why I am getting all zeros when I try to configure the device pins this way, i.e. setting the GPIO pin as an input to drive the bus high.
If this is the case, should I then use two GPIO pins instead of one, and should I make any other changes to the code besides this? Is it still possible to use one GPIO pin if I use an NPN transistor between the MCU and the DS1822 (DQ pin)? Would the connection be:
base of the transistor to the GPIO pin, collector to the DQ pin, and emitter to ground?
Note: I did a bit more reading and realized that most applications use a simulated open drain, so I guess that would not be the problem. Are there certain GPIO pins on the GT60 that provide lower or higher sink current, though? My understanding is that, if so, I should choose a GPIO pin that provides a higher sink current. I would really prefer not to use a transistor if I don't have to, since I want to start as simply as possible; I just want to take basic temperature measurements from the DS1822.
Hi Mac,
I will let you know if the code works when I get the replacement DS1822. Thanks once again for all your help; I really appreciate it. I suspect the one I had may have burned out. In the meantime, is there anyone with a DS1822 who might be able to test the code? If so, I would greatly appreciate it. Also, in the interim, I'm working on an LM35 analog sensor and integrating it with the analog-to-digital converter on the MC9S08GT60 MCU.
Hi Mac,
Thanks so much for that posting. It explains everything very well. So that code you posted was in main(), right? Then I would just take the bytes from spad.array[0] and spad.array[1], and those would be the LSB and MSB of the temperature, right? I will try to compile and run this and see if it works. Thanks again for your help.
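For reference, I gather that scaling those two bytes would look something like this (an untested sketch, assuming the datasheet layout: LSB in array[0], MSB in array[1], and 1/16 of a degree C per count at 12-bit resolution):

// Untested sketch: assemble the signed raw value and scale it.
int  raw        = (int)(((unsigned int)spad.array[1] << 8) | spad.array[0]);
long hundredths = (long)raw * 100 / 16;   // e.g. 0x0550 -> 8500 = 85.00 C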