Hello there. I have a custom board with an LCD that takes 24-bit color, but the MCF52277 only supports up to 18 bpp. I ran the eGUI demo project (with no modifications yet, so there may be timing problems etc.) and there is no display other than the backlight. I wonder if the 24-bit thing is a roadblock, or if I just lose a few bits of color? If the former, I'll have to switch to another LCD; if the latter, I'll live with it.
Any suggestion is appreciated.
Regards
Leong
You don't want loose bits, they might end up anywhere! :smileyhappy: Just tie the two LSBs of each of the RGB inputs to ground.
Consider the memory taken up by the frame buffer, or frame buffers. In our application we need four full-sized buffers. 18 bpp requires 32 bits of storage per pixel. If you can get away with 16 bits (5-6-5), then halving the amount of memory and the bus bandwidth might be worthwhile, if not essential.
If you want a 640 by 480 panel, that's 1.2 megabytes per buffer at 32 bits per pixel. It may also take up more than 100% of your memory bus bandwidth. If you're only using a QVGA or smaller panel then it won't be quite so bad.
Check out AN3606 for details.
Tom
Thanks for the hint Tom,
I checked the schematic and it does show the LSBs for each color tied to GND, so I guess I'm stuck with an 18 bpp configuration unless I change the hardware for the next rev. The panel I'm using is a Formike KWH050TG08-F02, which is 800x480 (http://www.displayfuture.com/engineering/specs/TFT/KWH050TG08-F02.pdf). There is a 256MB SDRAM on the board. I'll check the notes.
Much appreciated.
Leong
> there is no display
The driver software you're using may have assumed that the display pins had already been programmed from their default of being GPIO pins to being LCDC pins. That means PAR_LCDL, PAR_LCDH and others. Check the signals with a CRO and make sure you have LCDC clock, data and sync signals configured for LCDC operation and generating signals.
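A minimal sketch of that pin setup. The register addresses and the all-ones value below are placeholders, not verified silicon values; on a real build the definitions come from the Freescale MCF52277 header, and the PAR_LCDH/PAR_LCDL field encodings must be taken from the Reference Manual:

```c
#include <stdint.h>

/* Hypothetical register definitions -- substitute the real ones from
 * the MCF52277 header (e.g. mcf52277.h).  Addresses are placeholders. */
#define MCF_GPIO_PAR_LCDH (*(volatile uint32_t *)0xFC0A4010)
#define MCF_GPIO_PAR_LCDL (*(volatile uint32_t *)0xFC0A4014)

static void lcdc_pins_init(void)
{
    /* Route every LCD data/sync/clock pin to its LCDC function instead
     * of the reset default of GPIO.  Writing all-ones assumes the LCDC
     * function is the highest encoding for each field -- verify against
     * the pin assignment tables in the Reference Manual. */
    MCF_GPIO_PAR_LCDH = 0xFFFFFFFF;
    MCF_GPIO_PAR_LCDL = 0xFFFFFFFF;
}
```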
Read the LCD_ISR. If the UDR bit is getting set then the controller is trying to read more data than your SDRAM can supply. You'll need to optimise the SDRAM timing, the crossbar priorities and the FIFO settings; if this is happening, just run at a slower pixel clock for a while. You shouldn't see the ERR bit. One of the BOF or EOF bits should be setting; if neither is, then the LCDC isn't running.
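Those checks can be sketched as a small diagnostic routine. The bit positions below are placeholders (take the real LCD_ISR masks from the MCF52277 header); only the decision logic follows the post above:

```c
#include <stdint.h>

/* Placeholder bit masks for LCD_ISR -- use the real header values. */
#define LCD_ISR_BOF (1u << 0)  /* Beginning of frame */
#define LCD_ISR_EOF (1u << 1)  /* End of frame */
#define LCD_ISR_ERR (1u << 2)  /* Bus error response */
#define LCD_ISR_UDR (1u << 3)  /* FIFO underrun */

/* Classify a snapshot of LCD_ISR, per the checks described above. */
static const char *lcdc_diagnose(uint32_t isr)
{
    if (isr & LCD_ISR_UDR)
        return "underrun: memory can't keep up, slow the pixel clock";
    if (isr & LCD_ISR_ERR)
        return "bus error: should never be set";
    if (!(isr & (LCD_ISR_BOF | LCD_ISR_EOF)))
        return "no BOF/EOF: the LCDC isn't running";
    return "running normally";
}
```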
> 800x480
The Data Sheet indicates 60Hz refresh with a 30MHz pixel clock. You'll have to run it at 26.6MHz (assuming your bus clock is 80MHz). At 32 bits/pixel that's a burst memory read rate of 32 bits at 26.6MHz or 106MB/s. That's 83% of the memory bandwidth quoted in AN3606. Apart from really pushing it, you can only get that rate if everything else is tuned up perfectly.
The LCDC has to have the highest priority in the crossbar. This is not the default. You'll have to program that if you haven't already.
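Raising the LCDC's crossbar priority looks roughly like the fragment below. The register name, address and priority encoding are all assumptions modelled on the ColdFire crossbar switch (XBS) module; confirm every detail against the crossbar chapter of the MCF52277 Reference Manual before using it:

```c
#include <stdint.h>

/* Placeholder: the XBS priority register for the SDRAM slave port.
 * Address and field layout are NOT verified. */
#define MCF_XBS_PRS_SDRAM (*(volatile uint32_t *)0xFC004100)

static void xbs_give_lcdc_top_priority(void)
{
    /* Each field assigns a priority to one bus master.  This value is
     * a placeholder meaning "LCDC first, CPU next, others after" --
     * the reset default does NOT put the LCDC first, so this must be
     * programmed explicitly. */
    MCF_XBS_PRS_SDRAM = 0x76543201;  /* placeholder encoding */
}
```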
You may need to program the Burst Configuration Register (BCR) to 0x3FF. In the MCF5329, which we use, this is documented to enable bursting from the USB modules and the LCDC. In the MCF52277 it only mentions the USB module. That is either a documentation error (and it is required for the LCDC), or the LCDC bursts all the time, or maybe it can't burst at all. You'll probably need to analyse the SDRAM bus signals while changing the registers to reverse-engineer how this works, like I did.
You'll have to program the LCDC's LDCR. Ignore the suggestion to use "automatic bursting" as it is less efficient than using fixed bursts. I recommend the following:
MCF_LCDC_LDCR = MCF_LCDC_LDCR_BURST        /* Fixed burst length mode  */
              | MCF_LCDC_LDCR_HM(0x14)     /* Burst-copy 20 at a time  */
              | MCF_LCDC_LDCR_TM(0x0c);    /* Low water mark 12        */
With the LCDC taking over 80% of your memory bandwidth you've only got 20% left for the CPU. That's going to hurt. Do you have any other peripherals that need access, like EDMA or USB?
If you're only using the panel for static or slowly changing displays, then try running it at a slower pixel clock with a slower refresh rate, like 10Hz, if the panel allows it (the panel Data Sheet doesn't give a minimum clock, so you should be OK). If you want to display movies or animation then you might need to change to a different chip.
Tom
Thank you Tom, that is great information.
This is an instrument cluster application with some gauges, dials and numbers, so I guess an update rate of 20Hz is good enough. I am a control engineer who has worked on a lot of motor control (TI C2000) stuff, but this is the first LCD/Freescale project I'm putting my hands on.
The board is a rev 0 derived from the M52277EVB schematic, with some cuts and jumpers even on the RAM chip, so I guess I should first verify whether the SDRAM works.
One mystery for me is how to read the data sheet to figure out the 'timing' information required by the setup. The code I have is from the eGUI demo for the M52277EVB with CW7.2, and it is set up specifically for the eval board and its LCD. The code base is pretty huge and I'm still trying to work out even where the core CPU PLL is set, if anywhere (the oscillator is 16MHz), and where to set the 26.67MHz divider.
Thanks a lot for your time
Regards
Leong
Hmmm... with a global search I couldn't find anywhere in the project where the PLL register PCR is set...
http://www.freescale.com/webapp/sps/site/prod_summary.jsp?code=EGUI&fpsp=1&tab=Design_Tools_Tab
From the Reference Manual - read "7.2.1 PLL Control Register (PCR)".
Read the note:
Note: The reset values of PFDR and OUTDIV5 depend on the boot configuration mode. When BOOTMOD[1:0] = 00, 01, or 10, PFDR resets to 0x1E and OUTDIV resets to 0x7. If BOOTMOD[1:0] = 11, PFDR and OUTDIV5 reset to the value of SBF_RCON[7:0] and SBF_RCON[9:8], respectively, specified by serial boot.
So how is your board set up? Is it using the "Reset Configuration Override" or is it running with the defaults? Read "9.3.1 Chip Configuration Register (CCR)" and the following sections. You're probably using normal "oscillator mode", so that means that PCR takes the default values, documented in "7.2.1 PLL Control Register (PCR)".
The default is PFDR = 0x1E and OUTDIV5 = 7. That means:
Fvco = 16MHz x 30 = 480MHz
Fusb = 480MHz / (7 + 1) = 60MHz
Fsdramc = Fsys = 480MHz / (2 + 1) = 160MHz
Fsys2 = 80MHz
It is very likely that nothing changes the PCR register because it already defaults to the right value for your board and crystal.
We're using an MCF5329 and it defaults to the same speed as the above - which is NOT the way we want it. We have to fiddle around with the clocking registers to put it into "Limp" mode, switch to 240MHz and then switch back to the PLL.
The other thing to watch out for is that when you're developing software and loading it with the debugger, the first thing the pod does is run a "startup script" specific to your CPU and board. That script optionally writes the clock registers, then sets up the chip selects, the SDRAM controller and perhaps a bunch of other things, all before loading your code. This happens "invisibly", before your startup code runs, so often what you find as the "startup code" doesn't do these essential things. When you finally want to burn code to FLASH to run standalone, some startup code has to do everything that the debugger was doing. A good development system provides this code for you; with a "bare bones" system like ours we had to write all of it ourselves. Often all the "startup" is done by a separate bootstrap which sets up the hardware and then loads the "Application", which is the same one you run under the debugger and which doesn't have any (or much) hardware setup code in it.
Your LCDC clock is generated from the PCD field in the LCD_PCR. This is Fsys2 divided by (PCD + 1). The maximum pixel frequency is Fsys2/3, or 26.66MHz. Don't run at that speed or you won't have any bandwidth left for your CPU to do anything else.
> this is an instrument cluster application with some gauges and dials and numbers
Animating the needles is challenging on this chip, if not "courageous, Minister".
Been there, coded that, but on a 240MHz MCF5329 (50% faster than yours):
The above is an excellent web-based emulation of the product. Click on "Gauges" and then click on the little left and right arrows on each side of the Gauge Number display to cycle through the 11 gauges. Then click on the "Menu" button at lower left and then select and play with the Stopwatch. Check the G-Force graphics.
Check your "Private Messages" for more details. That's the little envelope near the "Search" button on this page that should be showing Yellow at the moment.
Tom
Search for "LCDC" in this forum. There are three pages of hits. A lot of them are by me on problems with the LCDC. I've been assured that the MCF52277 doesn't have the MCF5329 fatal problem of locking up solid when you try to disable the controller, but it has all the other restrictions.
You should "triple buffer" and trigger your frame updates from the vertical interrupts from the LCDC. You should also reload the LCD_SSAR (to switch to a new display buffer) in that interrupt routine, but note the frame won't actually switch until just before the NEXT interrupt, so that's why you have to be triple-buffered. The one being displayed, the one you told it to switch to on the last interrupt and the one your code is writing to. Read the following:
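A minimal sketch of that triple-buffer rotation, assuming a BOF interrupt handler. The buffer addresses are hypothetical and lcd_set_ssar() is a stand-in for the real LCD_SSAR write; real code would also only advance when the CPU has finished rendering the "drawing" buffer:

```c
#include <stdint.h>

#define NUM_BUFFERS 3

/* Hypothetical frame buffer base addresses in SDRAM. */
static uint32_t buffers[NUM_BUFFERS] = {0x40100000, 0x40280000, 0x40400000};

static volatile unsigned displayed = 0; /* buffer the LCDC is scanning out */
static volatile unsigned queued    = 1; /* buffer last written to LCD_SSAR */
static unsigned drawing            = 2; /* buffer the CPU renders into     */

/* Stand-in for the real register write to LCD_SSAR. */
static void lcd_set_ssar(uint32_t addr) { (void)addr; }

/* Call from the BOF (beginning-of-frame) interrupt.  The address
 * written to LCD_SSAR last frame has just taken effect, so the queued
 * buffer is now on screen and the old displayed buffer is free. */
void lcdc_bof_isr(void)
{
    unsigned freed = displayed;
    displayed = queued;
    queued = drawing;
    lcd_set_ssar(buffers[queued]);  /* takes effect at the NEXT frame */
    drawing = freed;
}
```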
https://community.freescale.com/message/61205#61205
You should interrupt on one of the BOF interrupts and not EOF (and certainly not EOF Loading Last Frame) as the relative timing between loading LCD_SSAR and the EOF interrupts is "unknown", and I spent 19 months trying to find out. That was because I wanted to stay double-buffered. Not impossible, but it took two years to get right.
You might even be better off feeding the Vertical Sync output back to an interrupt pin - that's the only way to get a known good LCDC interrupt timing on this chip.
Tom