i.MX Solutions Knowledge Base

In industries from medical diagnostics and transportation to precision agriculture and entertainment, engineers are increasingly challenged to find new ways to design in greater intelligence, connectivity, and performance while cutting cost, power consumption, and size. Single Board Computers (SBCs) are an ideal platform for quick, focused product design. They continue to evolve in sophistication, and the range of possibilities keeps expanding; as those capabilities grow, so do the choices facing design engineers. But which factors matter most when evaluating and selecting an SBC? http://www.digi.com/pdf/Design_vs_Build_web.pdf
Measuring 60mm by 49mm, the MYC-C8MMX CPU Module is a high-performance, cost-effective ARM SoM powered by the i.MX 8M Mini, NXP's first embedded multi-core heterogeneous applications processor built with the advanced 14LPC FinFET process technology. The MYC-C8MMX CPU Module provides an outstanding embedded solution for Home and Building Control, IoV, Industrial and Medical Instruments, Human Machine Interface (HMI) and other general-purpose industrial and IoT applications that require optimized power consumption while maintaining high performance. It is a minimum system integrating the CPU, 2GB DDR4, 8GB eMMC, 32MB QSPI Flash, a Gigabit Ethernet PHY and a PMIC. All controller signals are brought out through two 0.8mm-pitch 100-pin expansion connectors. The module is capable of running Linux and Android and is provided with plenty of software resources.

MYC-C8MMX CPU Module Top-view / MYC-C8MMX CPU Module Bottom-view

MYIR offers the MYD-C8MMX development board for evaluating the MYC-C8MMX CPU Module. The base board takes advantage of the strong media capabilities of the i.MX 8M Mini processor to provide MIPI-DSI, MIPI-CSI and LVDS interfaces as well as Audio In/Out ports. It also offers strong connectivity with 2 x USB 2.0 Host ports and 1 x Micro USB 2.0 Host/Device port, Gigabit Ethernet, a MicroSD card slot, a USB-based Mini PCIe interface for a 4G LTE module, WiFi/Bluetooth and an NVMe PCIe M.2 2280 SSD interface. MYIR can offer design services to customize the base board according to customers' requirements.

MYD-C8MMX Development Board Top-view / MYD-C8MMX Development Board Bottom-view

MYIR offers commercial- and industrial-grade options for the CPU modules. More information can be found at: http://www.myirtech.com/list.asp?id=617
This document explains how to set up Cairo to draw on screen with hardware acceleration using OpenGL ES 2.0 or OpenVG.

Introduction:

As you know, you can use the libraries mentioned above (OpenGL ES and OpenVG) to draw to the framebuffer with hardware acceleration on the i.MX6Q, but using them directly is fairly hard work. Why? Let me give an example to clarify. Imagine you want to draw an aircraft attitude symbol. The symbol needs several elements to look complete: 1) a circle, 2) lines, 3) text, 4) a triangle, 5) some custom shapes, for instance two L-shaped lines drawn horizontally.

If you have experience with OpenGL, especially OpenGL ES, you know that drawing a circle, a line, a triangle and so forth is not a really tough job. Of course, drawing these primitives in OpenGL takes more lines of code than the Cairo API, where you can draw them with just three lines of code. The hardest job is drawing text: in OpenGL, to draw even a simple string you have to deal with extra libraries such as FreeType to fetch the glyph data, then use an atlas approach to render the glyphs into a bitmap texture; whenever your application needs a character, you look up its position in the stored texture, fetch it and draw it. You also need a pair of dedicated OpenGL ES shaders for this.

So it is fine to use OpenGL or OpenVG directly if you are really skilled with them, or if you are looking for trouble! 😄 Personally, I prefer to use a high-level API and then focus on other aspects of my application.

Compiling Cairo:

This document does not cover configuring or compiling Cairo in detail. You can easily configure and compile it with the OpenGL ES backend using Yocto, Buildroot or any other embedded Linux build system (Yocto and Buildroot are not embedded Linux distributions themselves; they build a custom one for you), or even compile it manually.

To configure: ./configure --prefix=/home/super/Desktop/ROOTFS/MY_ROOTFS/usr --host=${CROSS_COMPILE} CFLAGS="-I/home/super/Desktop/ROOTFS/MY_ROOTFS/usr/include/ -DLINUX -DEGL_API_FB" LIBS="-L/home/super/Desktop/ROOTFS/MY_ROOTFS/usr/lib/ -lz" --enable-xlib=no --enable-egl --enable-glesv2

To compile: make

You can adapt the configuration to suit your own board; Cairo has a lot of options.

How to make a surface for Cairo:

If you have experience drawing shapes with Cairo, you know that you need a surface and a cairo_t* context for the drawing API to work on so that shapes appear on the screen. To create a Cairo surface that uses OpenGL ES, you have to configure EGL correctly (EGL is an interface between Khronos rendering APIs such as OpenGL, OpenGL ES or OpenVG and the underlying native platform windowing system)[1] and then create a Cairo surface from it.
EGLint config_attributes[] =
{
    EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
    EGL_RED_SIZE,        8,
    EGL_GREEN_SIZE,      8,
    EGL_BLUE_SIZE,       8,
    EGL_ALPHA_SIZE,      EGL_DONT_CARE,
    EGL_SURFACE_TYPE,    EGL_WINDOW_BIT,
    EGL_DEPTH_SIZE,      16,
    EGL_SAMPLES,         4,
    EGL_NONE
};

When you want to replace OpenGL ES 2.0 with OpenVG, it is enough to change the EGL_RENDERABLE_TYPE value from EGL_OPENGL_ES2_BIT to EGL_OPENVG_BIT.

The code below produces Figure 1 on screen:

Figure 1: Simple drawing by Cairo on i.MX6Q

//========================================================================
// Name        : testCairo.cpp
// Author      : Ali Sarlak
// Version     : 1.0
// Copyright   : GPL
// Description : EGL+Cairo GLIB
//========================================================================

#include <iostream>
#include <stdio.h>
#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <EGL/eglplatform.h>
#include <cairo/cairo-gl.h>
#include <EGL/eglvivante.h>
#include <stdlib.h>

#define DISPLAY_WIDTH 640
#define DISPLAY_HEIGHT 480

using namespace std;

int main()
{
    printf("START\n");
    printf("~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~\n");

    EGLContext eglContext;
    EGLSurface eglSurface;
    EGLBoolean resultB;

    /* Get a display handle and initialize EGL */
    EGLint major, minor;
    EGLDisplay eglDisplay = eglGetDisplay(EGL_DEFAULT_DISPLAY);

    resultB = eglInitialize(eglDisplay, &major, &minor);

    EGLint config_attributes[] =
    {
        EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
        EGL_RED_SIZE,        8,
        EGL_GREEN_SIZE,      8,
        EGL_BLUE_SIZE,       8,
        EGL_ALPHA_SIZE,      EGL_DONT_CARE,
        EGL_SURFACE_TYPE,    EGL_WINDOW_BIT,
        EGL_DEPTH_SIZE,      16,
        EGL_SAMPLES,         4,
        EGL_NONE
    };

    EGLint numberConfigs = 0;
    EGLConfig* matchingConfigs = NULL;

    /* First call: ask how many configs match the requested attributes */
    if (EGL_FALSE == eglChooseConfig(eglDisplay, config_attributes, NULL, 0, &numberConfigs))
    {
        printf("eglChooseConfig ERROR\n");
    }
    if (numberConfigs == 0)
    {
        printf("eglChooseConfig ERROR\n");
    }

    printf("number of configs = %d\n", numberConfigs);

    /* Allocate some space to store the list of matching configs... */
    matchingConfigs = (EGLConfig*) malloc(numberConfigs * sizeof(EGLConfig));

    if (EGL_FALSE == eglChooseConfig(eglDisplay, config_attributes, matchingConfigs, numberConfigs, &numberConfigs))
    {
        printf("eglChooseConfig ERROR\n");
        if (matchingConfigs != NULL)
        {
            free(matchingConfigs);
            matchingConfigs = NULL;
        }
        return -1;
    }

    printf("~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~\n");

    EGLint display_attributes[] =
    {
        EGL_WIDTH,  DISPLAY_WIDTH,
        EGL_HEIGHT, DISPLAY_HEIGHT,
        EGL_NONE
    };

    /* Window attributes */
    EGLint window_attribList[] =
    {
        EGL_NONE
    };

    /* Vivante framebuffer backend: get the native display and create a native window */
    EGLNativeDisplayType eglNativeDisplayType = fbGetDisplay(0);

    EGLNativeWindowType eglNativeWindow = fbCreateWindow(eglNativeDisplayType,
            0, 0, DISPLAY_WIDTH, DISPLAY_HEIGHT);

    eglSurface = eglCreateWindowSurface(eglDisplay, matchingConfigs[0], eglNativeWindow, window_attribList);

    if (eglSurface == EGL_NO_SURFACE)
    {
        printf("eglSurface = %x\n", eglGetError());
    }

    const EGLint attribListCtx[] =
    {
        // EGL_KHR_create_context is required
        EGL_CONTEXT_CLIENT_VERSION, 2,
        EGL_NONE
    };

    eglContext = eglCreateContext(eglDisplay, matchingConfigs[0], EGL_NO_CONTEXT, attribListCtx);

    if (eglContext == EGL_NO_CONTEXT)
    {
        printf("eglContext = %x\n", eglGetError());
        return -1;
    }

    cairo_device_t* cdt = cairo_egl_device_create(eglDisplay, eglContext);

    eglMakeCurrent(eglDisplay, eglSurface, eglSurface, eglContext);

    cairo_surface_t *surface = cairo_gl_surface_create_for_egl(cdt, eglSurface,
            DISPLAY_WIDTH, DISPLAY_HEIGHT);

    cairo_t *cr = nullptr;
    cr = cairo_create(surface);
    if (!cr)
    {
        printf("Wrong cairo_t!\n");
        return -1;
    }
    //*********************************************************************************************
    for (int index = 0; index < 1; ++index) {
        /* Draw a black cross */
        cairo_set_source_rgb(cr, 0, 0, 0);
        cairo_move_to(cr, 0, 0);
        cairo_line_to(cr, 200, 200);
        cairo_move_to(cr, 200, 0);
        cairo_line_to(cr, 0, 200);
        cairo_set_line_width(cr, 1);
        cairo_stroke(cr);

        /* Four semi-transparent rectangles */
        cairo_rectangle(cr, 0, 0, 100, 100);
        cairo_set_source_rgba(cr, 1, 0, 0, 0.8);
        cairo_fill(cr);

        cairo_rectangle(cr, 0, 100, 100, 100);
        cairo_set_source_rgba(cr, 0, 1, 0, 0.60);
        cairo_fill(cr);

        cairo_rectangle(cr, 100, 0, 100, 100);
        cairo_set_source_rgba(cr, 0, 0, 1, 0.40);
        cairo_fill(cr);

        cairo_rectangle(cr, 100, 100, 100, 100);
        cairo_set_source_rgba(cr, 1, 1, 0, 0.20);
        cairo_fill(cr);

        cairo_surface_flush(surface);
        eglSwapBuffers(eglDisplay, eglSurface);
    }

    // To check that Cairo can make a picture from the surface, a PNG file is created
    cairo_status_t s = cairo_surface_write_to_png(surface, "surface.png");
    // it is a picture made by Cairo [OK]
    cairo_destroy(cr);

    if (CAIRO_STATUS_SUCCESS == s)
    {
        printf("Status = OK \n");
    }
    else
    {
        printf("Status = ERROR <ERROR_CODE->%d>\n", s);
    }

    if (matchingConfigs != NULL)
    {
        free(matchingConfigs);
        matchingConfigs = NULL;
    }
    cairo_surface_destroy(surface);

    printf("END!\n");
    return 0;
}

How to be sure that my application uses the GPU:

If you have a look at https://community.nxp.com/thread/324670 you can profile a graphical application and investigate whether it uses the GPU or not; you can also measure performance and analyze the application with vAnalyzer.

According to that link, it is enough to set galcore.gpuProfiler=1 in U-Boot and then check the /sys/module/galcore/parameters/gpuProfiler file (read it with cat, vi, nano, etc.). If the output is 1, everything is set up correctly. The final step is exporting some environment variables:

export VIV_PROFILE=1
export VP_OUTPUT=sample.vpd
export VP_FRAME_NUM=1000
export VP_SYNC_MODE=1

VIV_PROFILE[0,1,2,3], VP_OUTPUT[any string], VP_FRAME_NUM[1,N], VP_SYNC_MODE[0,1]

Note: VIV_PROFILE=0 disables vProfiler (default), VIV_PROFILE=1 enables vProfiler, VIV_PROFILE=2 hands control to application calls, and VIV_PROFILE=3 allows control over which frames to profile with vProfiler via VP_FRAME_START and VP_FRAME_END.

If the application uses the GPU, a sample.vpd file will be created; if not, no vpd file appears. A consolidated build-and-profile sketch follows the reference below.

[1] - https://www.khronos.org/egl
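For convenience, the steps above can be combined into a short command sequence. This is only a minimal sketch, assuming a GCC-style cross toolchain referenced by ${CROSS_COMPILE}, the sysroot path used in the configure step earlier, and the testCairo.cpp file name from the listing; adjust names, paths and libraries to your own BSP.

# Minimal sketch: cross-compile the Cairo/EGL example and run it with vProfiler enabled.
# SYSROOT and the toolchain prefix are assumptions; adapt them to your environment.
SYSROOT=/home/super/Desktop/ROOTFS/MY_ROOTFS

${CROSS_COMPILE}g++ testCairo.cpp -o testCairo \
    -I${SYSROOT}/usr/include -L${SYSROOT}/usr/lib \
    -DLINUX -DEGL_API_FB \
    -lcairo -lEGL -lGLESv2

# On the target (after copying testCairo over and booting with galcore.gpuProfiler=1):
export VIV_PROFILE=1
export VP_OUTPUT=sample.vpd
export VP_FRAME_NUM=1000
export VP_SYNC_MODE=1
./testCairo        # a sample.vpd file should appear if the GPU path is used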
A Human Machine Interface (HMI) is a graphical interface between the user and the machine that allows humans to interact with machines, helping us control equipment effectively and acquire data in real time. Nowadays HMIs are widely used in countless sectors such as electronics, entertainment, automation, industry, military and medical. A user-friendly HMI can help increase productivity through a centralized control system. The MYD-Y6ULX-CHMI Display Panel introduced by MYIR is specially designed for HMI applications and is based on NXP's i.MX 6UL / 6ULL ARM Cortex-A7 processors. It is ready to run Linux and consists of an MYD-Y6ULX-HMI Development Board with a 7-inch capacitive LCD mounted on top. It is delivered with the necessary cable accessories, including one 12V/2A power adapter with four types of conversion plugs, one power switch cable and a quick start guide, to help users get started right out of the box.

MYD-Y6ULX-CHMI Display Panel

MYIR also offers an optional add-on IO board, the MYB-Y6ULX-HMI-4GEXP, for the MYD-Y6ULX-CHMI Display Panel to further extend its functionality with one more Ethernet port, WiFi & BT, a USB-based Mini PCIe interface for a 4G LTE module, audio and GPIOs, making a complete solution for HMI applications. The IO board is delivered with one WiFi antenna and one 4G antenna; the 4G module itself is only an option, and users can contact MYIR for details.

MYB-Y6ULX-HMI-4GEXP IO Board

Let's look more closely at the MYD-Y6ULX-CHMI Display Panel. The MYD-Y6ULX-HMI Development Board supports a DC 12V~24V power supply. It is built around the MYC-Y6ULX CPU Module, which has a compact design measuring only 37mm by 39mm. The module integrates the i.MX 6UL/6ULL processor, DDR3 and NAND flash/eMMC and is soldered onto the base board through its 1.0mm-pitch 140-pin stamp-hole (castellated-hole) expansion interface, which is cost-effective yet offers high reliability and strong vibration resistance. The 7-inch LCD provided by MYIR offers 800x480 pixels display resolution with a capacitive touch screen. Separating the LCD from the MYD-Y6ULX-HMI board, we can see on the back of the board one LCD interface (16-bit RGB), one capacitive touch screen interface and one resistive touch screen interface. The i.MX 6UL/6ULL series processors can support a maximum display resolution of 1366 by 768 pixels. On the MYD-Y6ULX-HMI board, from left to right, there are one 2-pin 3.81mm-pitch Phoenix connector for 12V~24V DC power input (a power switch cable is provided), one 3-wire RS232 serial port and one RS485 serial port on a 6-pin Phoenix connector, one 10/100Mbps Ethernet port, one USB Host port (Type A), one Micro USB OTG port and one TF card slot. Near the TF card slot there are one 2.54mm 3-pin header for the debug port and an RTC battery holder. On the other side of the board there are one 8-bit parallel camera interface, a buzzer and one reset button.

The MYD-Y6ULX-HMI board has two 2.0mm-pitch 2*20-pin headers for IO extension; the MYB-Y6ULX-HMI-4GEXP is an IO extension board designed by MYIR for exactly this purpose.

The MYD-Y6ULX-CHMI is ready to run the Linux operating system. MYIR has built an application demo, MEasy HMI, to run on this platform. MEasy HMI is a human-machine interface framework that contains a local HMI based on Qt5 and a Web HMI with a Python 2 back end and an HTML5 front end. The dependent software includes D-Bus, ConnMan, Qt5 applications, Python, Tornado and other components. The MEasy HMI block diagram is shown below.

MEasy HMI uses D-Bus as the access interface between the Qt application and the underlying hardware. MYIR provides a complete set of control and communication interfaces for RS232, RS485, CAN and LED and encapsulates them into a library for external use based on D-Bus Methods and Signals. MEasy HMI uses ConnMan to control network devices. ConnMan is a fully modular system that can be extended through plug-ins to support the management of Ethernet, WiFi, 3G/4G, Bluetooth and other network devices.

The directory structure of MEasy HMI is shown below.

Users can get more information about the MYD-Y6ULX-CHMI from MYIR's website: http://www.myirtech.com/list.asp?id=604
When using AdvancedToolKit 1.71 to erase NAND flash, an error is displayed. Why? Flash part: K9F2G08U0C-SIB0
Horw, Switzerland: Toradex announced it has joined Microsoft Azure Certified for Internet of Things (IoT), ensuring customers develop and deploy IoT solutions quickly with hardware and software that has been pre-tested and verified to work with Microsoft Azure IoT services. Microsoft Azure Certified for IoT allows businesses to reach customers where they are, working with an ecosystem of devices and platforms, enabling faster time to production. Toradex offers robust and compact embedded computing solutions encompassing System on Modules (SOMs) and Customized Single Board Computers (SBCs), which are used in a variety of industries such as Industrial Automation, Medical, Automotive, Robotics and many more. Toradex modules are ideal for quickly creating proof-of-concepts as well as scaling seamlessly from prototypes to tens of thousands of devices without the need to redesign the embedded computer. This reduces risk and time to market. A list of certified Toradex Apalis and Colibri SOMs can be found here. Toradex will be offering a webinar on "Getting Started with Azure IoT on Devices". Free registration can be done here.

"Microsoft Azure Certified for IoT validates our ability to jumpstart customers' IoT projects with pre-tested hardware and operating system combinations," said Stephan Dubach, CEO, Toradex. "Decreasing the usual customization and work required for compatibility ensures Toradex helps customers get started quickly on their IoT solution."

"Microsoft Azure Certified for IoT extends our promise to bring IoT to business scale, starting with interoperable solutions from leading technology companies around the world," said Barb Edson, general manager for Data Platform and Internet of Things, Microsoft. "With trusted offerings and verified partners, Microsoft Azure Certified for IoT accelerates the deployment of IoT even further."

IoT projects are complex and take a long time to implement. Customers find that choosing and connecting the right set of devices, assets or sensors to the cloud can be time-consuming. To jumpstart their IoT projects with confidence, customers are looking for certified devices and platforms that are tested for readiness, compatibility and usability with the Microsoft Azure IoT Suite. By choosing a partner from the Microsoft Azure Certified for IoT program, customers can save time and effort on project specs and RFP processes by knowing in advance what devices and offerings will work with the Azure IoT Suite. To learn more about Azure IoT Suite, click here.

About Toradex
Toradex is a Swiss-based company with offices around the world, offering ARM-based System on Modules (SOMs) and Customized SBCs. Powered by NXP®/Freescale i.MX 6, i.MX 7 & Vybrid, and NVIDIA® Tegra processors, the pin-compatible SOMs offer scalability in terms of price, performance, power consumption, and I/Os. Complemented with direct online sales and long-term product availability, Toradex offers direct premium support and ex-stock availability with local warehouses.
DSC_0059 Added by iWavesystems on April 3, 2012 at 8:41am    
Habey USA's EMB-2230 is a Pico-ITX ARM board built around an NXP i.MX6 Cortex-A9 processor, featuring an expansion header like those found on development and hobbyist boards, but optimized for use in commercial and industrial products. The board brings together the ease and adaptability of "maker boards" and the reliability and long product-availability life-cycle of commercial ARM boards.

Specifications:
NXP i.MX6 Cortex-A9 Processor
1GB On-Board DDR3
8GB (default) iNAND Flash Storage
Ethernet, BT 4.1 BLE, 802.11b/g/n
Dual LCD Interfaces (24-Bit LVDS and MIPI DSI)
Rich I/O Options, Including HDMI
3x USB (2x USB 2.0, 1x USB OTG)
USB & RS-232 Headers
RS-485/CAN Bus Terminal Block
40-Pin Expansion Header w/ PCIe
PoE Input

Accelerate Time To Market With a Developer-Friendly Design
The inclusion of easy-development features on the EMB-2230 helps ensure a quick time to market for any product. Included expansion headers, terminal blocks, and available 7" and 10" touch panel LCD kits make it easy to connect any sensors, hardware, or other devices and move your product directly from development to production.

Easy Expansion Modules - Or Use With Other Devices
Habey offers a variety of ready-to-use expansion modules, offering features like PoE (Power over Ethernet) or dual GbE ports; or use the 40-pin expansion header with PCIe, CAN Bus, RS-232, UART, SPI, I2C, and other GPIO devices.

Commercial Product Life-Cycle Reliability
The EMB-2230 has been carefully designed for a five-to-ten year service life-cycle, ensuring no need to constantly test and recertify products as components change.

Learn more: EMB-2230 Datasheet | EMB-2230 Product Page | Tech News - The Maker Board Explosion and Industrial ARM Boards
MYIR introduces the high-performance ARM SoM MYC-JX8MX CPU Module, built around the NXP i.MX 8M Quad processor featuring 1.3GHz quad ARM Cortex-A53 cores and a real-time ARM Cortex-M4 co-processor. The module runs Linux and is capable of working in an extended temperature range from -30°C to 80°C.

Measuring 82mm by 52mm, the MYC-JX8MX CPU Module integrates 1GB/2GB LPDDR4, 8GB eMMC, 256Mbit QSPI Flash, a Gigabit Ethernet PHY and a PMIC on board. A large number of I/O signals are carried to or from the i.MX 8M CPU Module through one 0.5mm-pitch 314-pin MXM 3.0 expansion connector, making it an excellent embedded solution for scanning/imaging, building automation and smart home, Human Machine Interface (HMI), machine vision and other consumer and industrial applications that require high multimedia performance.

MYC-JX8MX CPU Module (delivered with heat sink by default)

MYIR also offers a versatile platform, the MYD-JX8MX development board, for evaluating the MYC-JX8MX CPU Module. It takes advantage of the full feature set of the i.MX 8M processor and brings out rich peripherals through connectors and headers, such as 4 x USB 3.0 Host ports and 1 x USB 3.0 Host/Device port, Gigabit Ethernet, a TF card slot, a USB-based Mini PCIe interface for a 4G LTE module, WiFi/BT, Audio In/Out, HDMI, 2 x MIPI-CSI, MIPI-DSI, 2 x LVDS display interfaces, a PCIe 3.0 (x4) NVMe SSD interface, etc. It is delivered with the necessary cable accessories so customers can easily start development right out of the box. A MIPI camera module, the MY-CAM003, is offered as an option for the board.

MYD-JX8MX Development Board

MYIR offers 1GB and 2GB RAM options for the CPU modules and development boards at competitive prices:

Part No. | Item | Processor | LPDDR4 | eMMC | Unit Price
MYC-JX8MQ6-8E1D-130-E | MYC-JX8MX CPU Module | NXP i.MX 8M Quad, 1.3GHz quad ARM Cortex-A53 + 266MHz Cortex-M4 (MIMX8MQ6CVAHZAB) | 1GB | 8GB | $99
MYC-JX8MQ6-8E2D-130-E | MYC-JX8MX CPU Module | same | 2GB | 8GB | $119
MYD-JX8MQ6-8E1D-130-E | MYD-JX8MX Development Board | same | 1GB | 8GB | $279
MYD-JX8MQ6-8E2D-130-E | MYD-JX8MX Development Board | same | 2GB | 8GB | $299

All items support an extended working temperature range from -30°C to 80°C.
Hi all.

The display does not output normally.
1. This is the screen showing the problem.
2. This is the screen that should appear normally.

Therefore, I need to review whether the settings in the bootloader and the kernel are correct. Below are the system information and what has been done so far.

- Hardware
Module: Apalis iMX6, Ixora Carrier Board v1.1
LVDS 2-port panel: LA123WF4-SL05, 12.3" WU (1920 x RGB x 720) TFT-LCD

- Operating system
boot2qt: Boot to Qt for Embedded Linux 2.3.4
bootloader: U-Boot 2016.11-dirty
kernel: Linux version 4.1.44-2.7.5+g18717e2

- LCD timing

- Device tree of the kernel, arch/arm/boot/dts/imx6qdl-apalis.dtsi:

mxcfb1: fb@0 {
    compatible = "fsl,mxc_sdc_fb";
    disp_dev = "ldb";
    interface_pix_fmt = "RGB24";
    default_bpp = <24>;
    int_clk = <0>;
    late_init = <0>;
    status = "disabled"; // "okay" in arch/arm/boot/dts/imx6qdl-apalis-ixora-v1.1.dtsi
};

&ldb {
    status = "okay";
    split-mode; // dual-mode;

    lvds-channel@0 {
        reg = <0>;
        fsl,data-mapping = "spwg"; /* "jeida"; */
        fsl,data-width = <24>;
        crtc = "ipu2-di1";
        primary;
        status = "okay";

        display-timings {
            native-mode = <&timing01>;
            timing01: 1920x720 {
                clock-frequency = <89400000>;
                hactive = <1920>;
                vactive = <720>;
                hback-porch = <96>;
                hfront-porch = <30>;
                vback-porch = <3>;
                vfront-porch = <3>;
                hsync-len = <2>;
                vsync-len = <2>;
            };
        };
    };

    lvds-channel@1 {
        reg = <1>;
        fsl,data-mapping = "spwg";
        fsl,data-width = <24>;
        crtc = "ipu1-di0";
        status = "okay";

        display-timings {
            timing02: 1920x720 {
                clock-frequency = <89400000>;
                hactive = <1920>;
                vactive = <720>;
                hback-porch = <96>;
                hfront-porch = <30>;
                vback-porch = <3>;
                vfront-porch = <3>;
                hsync-len = <2>;
                vsync-len = <2>;
            };
        };
    };
};

- U-Boot environment:

vidargs=video=mxcfb0:dev=ldb,1920x720@60,if=RGB24, video=mxcfb1:off video=mxcfb2:off video=mxcfb3:off

- Kernel log:

[ 0.244330] MIPI DSI driver module loaded
[ 0.244682] ldb 2000000.aips-bus:ldb@020e0008: split mode
[ 0.244951] ldb 2000000.aips-bus:ldb@020e0008: split mode or dual mode, ignoring second output
[ 0.245615] 20e0000.hdmi_video supply HDMI not found, using dummy regulator
[ 0.247074] mxc_sdc_fb fb@0: registered mxc display driver ldb
[ 0.262134] mxc_sdc_fb fb@0: 1920x720 h_sync,r,l: 2,30,96 v_sync,l,u: 2,3,3 pixclock=89405000 Hz
[ 0.272800] imx-ipuv3 2800000.ipu: IPU DMFC DP HIGH RESOLUTION: 1(0,1), 5B(2~5), 5F(6,7)
[ 0.306740] mxc_sdc_fb fb@0: 1920x720 h_sync,r,l: 2,30,96 v_sync,l,u: 2,3,3 pixclock=89405000 Hz
[ 0.354510] Console: switching to colour frame buffer device 240x45
[ 0.389237] mxc_sdc_fb fb@1: mxcfb1 is turned off!
[ 0.389484] mxc_sdc_fb fb@2: mxcfb2 is turned off!
[ 0.389720] mxc_sdc_fb fb@3: mxcfb3 is turned off!

- Output of fbset on the target:

root@b2qt-apalis-imx6:~# fbset
mode "1920x720-60"
    # D: 89.405 MHz, H: 43.655 kHz, V: 59.966 Hz
    geometry 1920 720 1920 1440 24
    timings 11185 96 30 3 3 2 2
    accel false
    rgba 8/16,8/8,8/0,0/0
endmode

Is there anything else to check? Thanks.
http://www.youtube.com/watch?feature=player_embedded&v=RzmsFxb3EcQ
Uploaded by Digidotcom on Jul 30, 2010. For more information visit http://www.digi.com/products/embeddedsolutions/connectcore-wi-mx51.jsp.
Dear all,

We can currently use the USB port for program downloading; is there a serial-port download tool for the i.MX6UL? With the J-Link tools, the target starts running under J-Link control, but after a power cycle it does not start. Do you have an image that can be downloaded to QSPI NOR flash with J-Link so that the target board boots and runs after power-on?
Airbus's connected factory to shorten time to market, Rémy Martin's connected bottle to fight counterfeiting, Schindler's elevator smart sensors to improve security, the Cisco-IBM connected port in Colombia to enable predictive maintenance: these are some successful examples of B2B IoT creating value and business, and there are many more to come.

MCKINSEY ASSESSES THAT 70% OF THE POTENTIAL VALUE ENABLED BY IOT SHOULD COME FROM B2B!
McKinsey Global Institute – "The Internet of Things: mapping the value beyond the hype" – June 2015

A growing number of companies understand the potential of IoT for B2B markets and the trillions of dollars in revenue expected in 2020 (from 3 to 20 trillion depending on sources and studies). That said, you don't develop a Bluetooth key ring the same way as a sensor designed to monitor temperature in a hot caustic reactor lost in the middle of nowhere and requiring 99.9% availability. While the main challenges of B2C IoT will remain the business application and data mining, B2B adds complexity to the device and its direct environment (gateway, other IoT devices, IT, etc.). That is why we make a distinction between the "sexy" IoT focused on B2C and its challenges (marketing, business model, retention, etc.) and what we call the "serious IoT", which is more related to industrial and B2B stakes. This article is the first of a series in which I aim to describe the whole process of IoT project development, from a business point of view as well as a technical one. In this first article I will present what I believe is the best methodology for starting a B2B or industrial IoT project.

What are the challenges of serious IoT? What are the key success factors for launching a product? Where should you begin, and which steps should you follow?

THE FOUR PILLARS OF A SUCCESSFUL IoT PROJECT

Before I dig into the process to follow, let's share some key success factors that I have identified across all the IoT projects I have seen and run:

Design thinking
Because IoT is "hype", many companies launch a project just to have IoT and forget the simple saying "no pain, no gain". If the project addresses no pain, it will certainly end up in the archive box of the data room. Design thinking enables a customer-centric approach at each stage of development and ensures your project/product relieves a pain or brings a benefit for the customer (even if the customer is internal).

Master a wide range of technologies
McKinsey assesses that system interoperability represents 40% of the potential IoT value. The "inter" of interoperability means that companies will need partners mastering many different technologies to make all the layers and devices work together. In the embedded/IoT world, this can easily exceed 50 technologies (HW architectures, OS, radio & network protocols, frameworks, applications, etc.). So the success of an IoT project, and more widely of an embedded project, requires moving from "silo" technical expertise to a system approach coupled with technical expertise. Designing the device itself also requires a wide range of expertise and a system approach to optimize the whole system based on the business application requirements.

Reliable partners (whether for technologies or for distribution channels)
This is often called "open innovation", a term that can freak out CEOs or CTOs. It simply means that you build your project by involving partners at each stage to create more value.
As IoT impacts every single block of the business model (distribution channel, revenue model, communication, key activities, key resources, etc.), no single company can hold every related asset internally. So finding the right partners, and sharing value with them, is key to managing and rolling out the project.

Agile approach
This is another buzzword, but it is not so obvious for companies that do not come from the software industry, or that come with a pure embedded-software mindset and its waterfall approach. IoT sees many newcomers discovering the software challenges and trying to apply their regular development processes (the V cycle, for example) to an IoT project. That is the best way to burn the project out in endless discussions on product scope, spend a lot of money redeveloping things, and delay the launch forever.

WHERE AND HOW TO START YOUR IoT PROJECT?

Now you're thinking: "Hmm, interesting, thanks Mr Consultant for this completely non-operational advice. But that doesn't help me get started." Don't leave just yet, here is the practical part! These are the first steps to follow when you want to run an IoT project:

1. START WITH "WHY"
As Simon Sinek would say, you'd better start with the "why" before launching any useless project. So, why do I want to launch an IoT project? Do I want to launch something that makes my company look trendy and innovative? Do I want to save costs by optimizing my business processes (maintenance, operations, production, etc.)? Do I want to enable new business models in my company's offer, thanks to the IoT opportunities (renting vs. selling, data value, new services, service vs. product, etc.)? Do I want incremental innovation to refresh some of my products? Do I want to use the project as a Trojan horse to digitalize my company?

Over the past few years, I have seen all of these motivations among management teams, and all of them are fine. But you cannot pursue all those goals at the same time, and you certainly won't design the same project depending on the choice you make. As we say in French, "choisir, c'est renoncer", which roughly translates to "choosing is giving up". So take the time to clearly state your motivations and then select the one that will guide your focus in the coming months.

2. DESIGN USE CASES AND MAKE ASSUMPTIONS
Easier said than done, but first forget about technology and product, and just think about what IoT could enable in your environment and for which customer this would be most valuable. Draw several customer "journeys" and see where innovation could act as a painkiller or a gain creator.

Let's take the example of a maintenance scenario. The idea is to allow remote action on devices in the field, for instance coffee machines installed in gas stations all over Europe. In that case, ask yourself how IoT could make maintenance more efficient. Try to assess the time, money and security gains and quantify them. Let's say you identified that among 1000 installed machines, you are likely to have 5 customer claims per week and therefore 5 diagnoses to perform per week. Can IoT help you run the diagnosis remotely? Can IoT help you solve the problem remotely? In that case, will that save all on-site trips? How much money would that save for the company operating the machines? Knowing that, you can start building a first draft of the business model by making assumptions: how much of that value can you capture? What business model can you build around it? How much will it affect your customer's processes?
Have you got the right distribution channel to sell this new offer? Which key assets and activities would you need to bridge the gap between the current status and this innovation?

3. GET OUT OF THE BUILDING
With your use cases and key assumptions in your pocket, you now need to go and meet potential customers and partners. The more you share, the more your project will evolve into a credible scenario. Who in your existing base could be your early adopters? Which of your customers feel the pain you relieve most acutely (even better if they have tried to solve it themselves with a workaround)? In our remote-maintenance example, they would have some home-made webcam system on each site to see the machine state and detect some issues without any on-site intervention. Once you have identified 5 to 10 contacts, go out and meet them, and try to understand several things: the high-level stakes, the problems they have in the field, the way they have tried to solve them, the change process and the stakeholders; then (and only then) you can present your innovation and collect feedback. A few slides are enough for the presentation; there is no need for a prototype or any bigger investment. You will be amazed at the quantity of information you can collect that way. And remember: don't listen to what people say, look at (or try to understand) what they actually do.

4. BUSINESS MODEL AND FUNCTIONAL SPECIFICATIONS
You have completed your first iteration, congratulations! You wrote down assumptions, you went into the field to test them, and you collected valuable insights from your target customers. Maybe your assumptions proved completely wrong; then go back to stage 2! Otherwise, lucky you, you can write down a v1 of the business model and better define your product's functional specifications. This is where you can start defining features, functionalities, prices, offers, channels, technical constraints, costs, financial figures, etc. At the end of this stage you will have some kind of business plan, a sales pitch, functional specifications, and maybe even technical specifications for your IoT project.

5. POC, POC, POC
This is one of the hardest parts of any innovative project: build a Proof of Concept and test it. The questions are: what are the key features/attributes that I need to test to prove that my concept makes sense for customers? How can I do that as cheaply as possible in order to keep my budget for the real product? You will need to be very smart, or pay a smart provider, to strip your end vision down to just the key attributes you want to test. Going back to the remote-maintenance example: can you build some basic software on a Raspberry Pi board connected to the machine, coupled with a basic web interface that gives critical information about the machine, for instance power consumption, run time and temperature? Even if the final product won't use a Raspberry Pi, even if you want the web interface embedded into an app, and even if you want twice as many indicators, just focus on the key elements. And test. Doing so, you allow your customer to see real progress, to feel involved in the development process, and to influence the final outcome. On your side, you will collect key information that would take months or even years to collect if you had done it on the real product. A Proof of Concept can be a functional prototype, a design prototype, or both; that depends very much on the project and, again, on the key attributes/functionalities you want to test.

6.
ANOTHER LOOP TO COME
Congratulations, you have made another loop. You are about to become an expert in so-called "iterative development"! If you don't feel like one, don't worry, as you will have many more loops following the same process: make assumptions, test, measure, learn, adjust, and make new assumptions, test, measure... Each loop will allow you to adjust the business model, the functional specifications and the customer engagement, and to go further into your product development.

The complete "Lean Startup" process

The key is to keep in mind that your goal here is not to have the perfect product. It is simply to learn as much as possible in each loop while spending as little as possible. Make as many loops as you can until you reach a satisfying v1 product brief. But that is for chapter 2…

Originally written on the WITEKIO technical blog by Samir Bounab, Chief Sales Officer, WITEKIO, 15 September 2017
The i.MX53 General Market Launch at Embedded World
Added by Marsha Chang on March 8, 2011 at 6:34pm

The i.MX53 launch was successful at Embedded World in Nuremberg, Germany. We had many partners demoing i.MX53 solutions, as well as the i.MX53 Quick Start board showing HD 1080p video decode and the i.MX53 Tablet reference design (SABRE Platform for Tablets). For more information, visit www.freescale.com/imx53.
This is the document for SUSPEND and WAKE-UP on the i.MX8QM-MEK platform. Wake-up can be triggered by any of the following sources (a consolidated sketch follows this section):
- M4 debug UART
- A53 debug UART
- Power key

Source: imx-p9.0.0_2.1.1-auto-ga

CONFIGURATION
Disable the default wake-up source, the RTC timer (patch: Disable_auto_wakeup.diff).

SUSPEND
1. Command line:
# echo mem > /sys/power/state
2. Dumpsys command:
# dumpsys activity service com.android.car inject-vhal-event 0x11410a00 4,2

WAKE-UP
1. BUTTON
Press the power key SW3 (0.5s).

2. A53 DEBUG UART
Enable debug UART wake-up:
# cat /sys/devices/platform/5a060000.serial/tty/ttyLP0/power/wakeup
disabled
# echo enabled > /sys/devices/platform/5a060000.serial/tty/ttyLP0/power/wakeup
To wake up, press the ENTER key twice.

3. M4 UART WAKE-UP
Apply the RPMSG patch (L4.14.98.diff).
# cat /sys/devices/platform/5a070000.serial/tty/ttyLP1/power/wakeup
disabled
# echo enabled > /sys/devices/platform/5a070000.serial/tty/ttyLP1/power/wakeup
To wake up, type some input in the M4 console.
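For convenience, the A53-side steps above can be collected into one small script. This is a minimal sketch assuming the same sysfs paths and the "echo mem" suspend method quoted in this post; the logging lines are illustrative additions, and the paths should be adjusted for boards other than the i.MX8QM-MEK.

#!/bin/sh
# Minimal sketch: enable both debug-UART wake sources, then enter suspend-to-RAM.
A53_UART_WAKE=/sys/devices/platform/5a060000.serial/tty/ttyLP0/power/wakeup
M4_UART_WAKE=/sys/devices/platform/5a070000.serial/tty/ttyLP1/power/wakeup

echo enabled > "$A53_UART_WAKE"
echo enabled > "$M4_UART_WAKE"

echo "wake sources:"
cat "$A53_UART_WAKE" "$M4_UART_WAKE"

# The board resumes on the power key, A53 UART input, or M4 UART input.
echo mem > /sys/power/state
echo "resumed at $(date)"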
Hi. We are using the i.MX6 SABRESD board with ltib-4.1.0. We need blit functionality for OpenGL framebuffer objects. It is provided by the OpenGL GL_EXT_framebuffer_blit extension, but that extension is missing on i.MX6. We use the QOpenGLFramebufferObject class from Qt. Its blitFramebuffer() method was working fine on desktop and Android devices, but its hasOpenGLFramebufferBlit() method returns false on i.MX6, which means the extension is not present. As expected, blitFramebuffer() itself does not work (the screen remains black). We need this extension enabled. Why is it missing on i.MX6, and is it possible to add it? Regards, Federico
This 4-day in-depth technical training targets OEMs and customers starting the development of a Linux + Android based device on the ARM architecture. It covers all aspects related to the use of Linux and Android for embedded systems, including kernel architecture, development tools and environment, BSP adaptation, custom driver development, and Android image creation, deployment and debugging. With 3 days of base content plus 1 optional day on advanced debugging techniques and advanced driver development concepts, this course offers maximum flexibility to match attendees' expectations. As part of its collaboration with Freescale, Adeneo Embedded will offer each attendee the i.MX 6 series development board built to the Freescale SABRE Lite design and used during the training, allowing customers to continue their evaluation and development after the class. For registration: Invitation_Training_Android_Boston_July_2014 / All static html / Media - Adeneo Embedded
Bitcoin is a cryptocurrency that is quite popular among investors, tech enthusiasts and some digital sellers/buyers due to its flexible, anonymous and robust nature. BFGMiner is a bitcoin miner that can mine on a range of devices, from ASICs to FPGAs, GPUs and even obsolete CPU systems. This article guides you step by step through bitcoin mining on an i.MX8X platform using bfgminer.

1) Download the necessary software.
bfgminer: https://github.com/luke-jr/bfgminer.git
jansson: https://github.com/akheron/jansson.git
uthash: https://github.com/troydhanson/uthash.git

2) Cross-compile the software.

bfgminer:
./configure --prefix=${YourDirectory} --host=aarch64-linux-gnu --enable-scrypt --enable-cpumining --without-libevent --without-libmicrohttpd
make

jansson:
./configure --prefix=${YourDirectory} --host=aarch64-linux-gnu
make

If everything runs correctly, you should get the following binaries and libraries:

Ubuntu14:/opt/output$ ls -R
.:
bin include lib sbin share
./bin:
bfgminer bfgminer-rpc start-bfgminer.sh
./include:
jansson_config.h jansson.h libbase58.h libblkmaker-0.1
./include/libblkmaker-0.1:
blkmaker.h blkmaker_jansson.h blktemplate.h
./lib:
libbase58.la libbase58.so libbase58.so.0 libbase58.so.0.0.2
libblkmaker-0.1.la libblkmaker-0.1.so libblkmaker-0.1.so.6 libblkmaker-0.1.so.6.1.0
libblkmaker_jansson-0.1.la libblkmaker_jansson-0.1.so libblkmaker_jansson-0.1.so.6 libblkmaker_jansson-0.1.so.6.1.0
libjansson.a libjansson.la libjansson.so libjansson.so.4 libjansson.so.4.10.0
pkgconfig

3) Install those binaries and libraries onto the i.MX8X target filesystem under /usr/bin and /usr/lib (see the sketch below), then run the following command to start mining:

#bfgminer -o stratum+tcp://us.ss.btc.com:1800 -u nxa001.001 -p ""
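A minimal deployment sketch for step 3, assuming the build output was installed to /opt/output on the host and the board is reachable over SSH; the /opt/output path and the root@imx8x address are placeholders, so substitute your own values.

# Minimal sketch: copy the cross-compiled binaries and libraries to the target.
OUTPUT=/opt/output
TARGET=root@imx8x   # placeholder address

scp ${OUTPUT}/bin/bfgminer ${OUTPUT}/bin/bfgminer-rpc ${TARGET}:/usr/bin/
scp ${OUTPUT}/lib/lib*.so* ${TARGET}:/usr/lib/

# Then, on the target, start mining against the pool quoted in the post:
# bfgminer -o stratum+tcp://us.ss.btc.com:1800 -u nxa001.001 -p ""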