App note: DMX512 receiver and transmitter on a PIC

DMX512 is a 3-wire interface protocol used in professional lighting. This app note from Microchip explains all the protocol details and how to implement it on any PIC microcontroller with a hardware UART peripheral. The only external part required is an RS485 transceiver.

A DMX512 transmitter sends a single packet of up to 512 8-bit data slots, one after the other, to the receivers connected on the same 3-wire bus. Each receiver has a preprogrammed address (one of 512) and waits for its slot to come around in the packet. Both the transmitter and the receivers run at 250 kbit/s; each slot is framed by a start bit and two stop bits, and the start of a packet is marked by a break that resynchronizes the receivers.

This app note covers all the information you need to build a DMX512 transmitter or receiver. Source code is also provided, as well as all the UART register initialization to get you started on your DMX projects.

Join the Conversation


  1. One thing about Microchip's DMX break detection in the receiver half: it does not seem to work well with some members of the PIC18 family, particularly the PIC18 “K” series. There are a few threads on the Microchip forum covering this.

    1. The Livid Instruments Brain is a PIC18F67J50-based DIY board which has a header designed for an off-board RS485 transceiver. The extra I/O pin needed is also available. I never did build the necessary daughter board, but it’s been on my to-do list ever since I designed the Brain. The Brain is primarily designed for USB-MIDI operation from a buttons-and-pots (faders and knobs) control board with LEDs, so DMX512 support seemed important to design in.

      JTR: Do you have a link to that Microchip forum topic? It’s been ages since I hung out there, and I remember it’s pretty huge to navigate.

      1. Sorry, I don’t have the links. I seem to recall that there are several (3+) threads that I have been involved in. I have pointed out that the Microchip way of detecting a break is not sound: it is possible for the break detection to also catch part of the sync byte and throw the receiver out by one byte. This is what others have also found.

        In one of these threads there is a trick for generating the break by changing the baud rate to something much lower, which saves having to use a separate I/O pin. There are other ways too: I disable the UART, make the pin an output, and time the break very accurately.
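        The baud-rate trick works because transmitting a 0x00 character holds the line low for the start bit plus eight zero data bits, i.e. nine bit times, while the stop bits become the mark-after-break. A quick host-checkable sketch of that arithmetic (the 88 µs minimum break is from the DMX512 spec; the candidate baud rates are just examples):

        ```c
        #include <assert.h>

        /* Low time produced by transmitting 0x00: start bit + 8 zero
         * data bits = 9 bit times (the stop bits stay high and serve
         * as the mark-after-break). */
        static unsigned break_low_us(unsigned baud)
        {
            return 9u * 1000000u / baud;
        }

        int main(void)
        {
            /* At the normal 250 kbaud a 0x00 gives only 36 us of low --
             * far too short for the >= 88 us break the spec requires. */
            assert(break_low_us(250000) == 36);

            /* Dropping to ~90.9 kbaud before sending the 0x00 stretches
             * the low to 99 us, a valid break; switch back to 250 kbaud
             * for the rest of the packet. */
            assert(break_low_us(90909) >= 88);
            return 0;
        }
        ```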

  2. Hmm, I thought there was something about disabling and re-enabling the UART that would not work, but maybe I’m thinking of another processor besides the PIC and another problem altogether.

    While we’re on the topic, what processors implement DMX512 without the break detection problems that the PIC suffers? Are there any UART peripherals which handle DMX512 more seamlessly and directly? Does it require a custom UART or custom logic?

    I enjoy working with the PIC, but I’m not married to Microchip. If there’s a better chip for DMX512 then I’m all over it…

    1. Well, you cannot argue against what is working. Yes, you can disable the UART and take control of the I/O pin without problems; I am doing exactly that.

      No PIC has any real means of detecting the break as a specific DMX break. Many PICs have break detection for the LIN bus, but that is a fixed period, always 13 bits. Not good enough for DMX.

      However, DMX break detection can be coded into the firmware without the requirement for any additional hardware. Every system I have seen (and used) starts out by detecting a framing error. The problem is then the timing of the rest of the break: how do you stop the rest of the break and the MAB from being mistaken for the ZERO sync character? The timing here is critical. If the break is close to being an exact multiple of one character period, then the last part of the break along with the MAB can be mistaken for a valid sync character.

      It is a matter of timing, and because the timing is variable the problem may happen never, sometimes, or all of the time. This is exactly what others have found.

      The only really robust solution is to poll the RX line once you have detected the initial framing error. Anything else is a “fair weather” solution. Yet almost all, if not all, of the code I have seen simply polls for a framing error followed by a valid sync character, and it works “mostly.”

      As it is, the Microchip code has been shown to mostly work on the standard PIC18 family, but it fails on the PIC18 “K” series, according to what others are telling me. Why, I do not know; on paper the UART is pretty much the same…

      If you draw pictures in your head of the timing during the break, you will see what I mean about the end of the break and the start of the MAB appearing as a valid sync character, and you will be able to code something better than the Microchip sample. What is really needed is bit-by-bit resolution, not the character-by-character resolution that the hardware UART provides.
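      The check described above can be sketched as a small host-testable function: after the framing error, poll and time the RX line yourself, and only accept a break when both the low period and the following mark-after-break are long enough (the 88 µs and 8 µs minimums are from the DMX512 spec; the function name is illustrative):

      ```c
      #include <assert.h>
      #include <stdbool.h>

      /* After the UART flags a framing error, firmware polls the RX
       * pin and times how long it stays low (the break) and then high
       * (the MAB).  Only when both times are long enough is the next
       * character treated as the start code. */
      #define MIN_BREAK_US 88
      #define MIN_MAB_US    8

      static bool valid_break_and_mab(unsigned low_us, unsigned high_us)
      {
          return low_us >= MIN_BREAK_US && high_us >= MIN_MAB_US;
      }

      int main(void)
      {
          /* A real break (e.g. 100 us) followed by a 12 us MAB: accept. */
          assert(valid_break_and_mab(100, 12));
          /* The tail of a break plus a zero start code can look like a
           * character to the UART, but its low time is only 36 us. */
          assert(!valid_break_and_mab(36, 8));
          /* Long low but no mark-after-break: still reject. */
          assert(!valid_break_and_mab(100, 2));
          return 0;
      }
      ```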

      1. Sounds like a perfect opportunity for one of those MPU chips that combine FPGA features with a processor. Didn’t Ian discuss a new PIC with this capability?

      2. Finding BREAK is actually really simple, and it was deliberately designed to be!
        You’re simply looking for a “low” that lasts more than 96 µs.
        So, when you detect a framing error, start a timer with interrupt-on-change or a CCP input and check that the line is low.

        When the low-to-high edge occurs, if the line has been low for more than 56 µs then it was a BREAK; if not, it was noise. That is much longer than the 36 µs a “0” byte would produce, so a mistake will not occur.

        Personally, I think AN1076 is responsible for a huge number of problems in cheap LED lights, because it will misinterpret noise as a BREAK and can easily misinterpret the 0 start code as a BREAK.
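        The recipe in this comment can be expressed as a pure function: a zero data byte holds the line low for at most 36 µs (start bit plus eight zero bits at 4 µs/bit), while a legal break is at least 88 µs, so any threshold between them separates BREAK from data or noise; the 56 µs figure above sits comfortably in that window (a sketch with illustrative names):

        ```c
        #include <assert.h>

        /* On the rising edge after a framing error, the timer value
         * gives the length of the low pulse; classify it against a
         * threshold chosen between 36 us (zero byte) and 88 us
         * (minimum break). */
        enum rx_event { RX_NOISE_OR_DATA, RX_BREAK };

        static enum rx_event classify_low(unsigned low_us)
        {
            return low_us > 56 ? RX_BREAK : RX_NOISE_OR_DATA;
        }

        int main(void)
        {
            assert(classify_low(100) == RX_BREAK);        /* genuine break */
            assert(classify_low(36) == RX_NOISE_OR_DATA); /* zero byte     */
            assert(classify_low(10) == RX_NOISE_OR_DATA); /* glitch        */
            return 0;
        }
        ```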
