mapey87
Joined: 24 Mar 2010 Posts: 2
Problem in PIC16F877A |
Posted: Wed Mar 24, 2010 5:33 pm |
Can anyone help me? This is my program, and I don't know what's wrong with it.
Code:
#include <16F877.h>
#device ADC=16
#fuses HS,NOWDT,NOPROTECT,NOLVP
#use delay(clock=20000000)
#use rs232(baud=9600, xmit=PIN_C6, rcv=PIN_C7)

void main()
{
   unsigned int16 Range, total, i;

   setup_port_a( ALL_ANALOG );
   setup_adc( ADC_CLOCK_INTERNAL );
   set_adc_channel( 0 );

   printf("Wetness value\n\r");

   do{
      i = Read_ADC();
      Range = i*(6275/10000.0);
      printf("Value = %ld\n\r", i);
      delay_ms(1000);
   }while(true);
}
PCM programmer
Joined: 06 Sep 2003 Posts: 21708
Posted: Wed Mar 24, 2010 6:24 pm |
Why are you using ADC=16? That will cause the ADC result to be left-justified. Is that what you want? Normally, people want the result to be right-justified.
This will give you a 10-bit result, and it will be right-justified:
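A sketch of the directive being suggested, assuming the standard CCS `#device` syntax (it would replace the `#device ADC=16` line in the program above):

```c
#device ADC=10   /* 10-bit, right-justified ADC result (CCS directive) */
```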
mapey87
Posted: Wed Mar 24, 2010 8:25 pm |
I already changed it, but the output I get is 163... The output that I want is from 0-15. Or should I use float instead of int16?
Ttelmah
Joined: 11 Mar 2010 Posts: 19535
Posted: Thu Mar 25, 2010 3:09 am |
First, as a comment, don't use ADC_CLOCK_INTERNAL. This is only designed for use with clock rates below 1MHz, or when you put the processor to sleep while performing the conversion. The 'correct' conversion clock for your 20MHz master clock is:
ADC_CLOCK_DIV_32
Using the wrong conversion clock affects the accuracy slightly.
Now, your maths (6275/10000) effectively multiplies the ADC value by 0.6275. The ADC value is 0 to 1023, so it'll give numbers from 0 to 641. Nowhere near what you want....
It saves significant time to use integer arithmetic where possible, so something like:
Code:
i = Read_ADC();
Range = i*16;                         //Will give 0 to 16368
printf("Value = %5.3lw\n\r", Range);  //will print this as 0.000 to 16.368
delay_ms(1000);
Best Wishes