McSema

McSema is an executable lifter. It translates ("lifts") executable binaries from native machine code to LLVM bitcode. LLVM bitcode is an intermediate representation form of a program that was originally created for the retargetable LLVM compiler, but which is also very useful for performing program analysis methods that would not be possible to perform on an executable binary directly.

McSema enables analysts to find and retroactively harden binary programs against security bugs, independently validate vendor source code, and generate application tests with high code coverage. McSema isn’t just for static analysis. The lifted LLVM bitcode can also be fuzzed with libFuzzer, an LLVM-based instrumented fuzzer that would otherwise require the target source code. The lifted bitcode can even be compiled back into a runnable program! This is a procedure known as static binary rewriting, binary translation, or binary recompilation.

McSema supports lifting both Linux (ELF) and Windows (PE) executables, and understands most x86 and amd64 instructions, including integer, X87, MMX, SSE and AVX operations. AARCH64 (ARMv8) instruction support is in active development.

Using McSema is a two-step process: control flow recovery, and instruction translation. Control flow recovery is performed using the mcsema-disass tool, which relies on IDA Pro, Binary Ninja, or DynInst to disassemble a binary file and produce a control flow graph. Instruction translation is then performed using the mcsema-lift tool, which converts the control flow graph into LLVM bitcode. Under the hood, the instruction translation capability of mcsema-lift is implemented in the remill library. The development of remill was a result of refactoring and improvements to McSema, and was first introduced with McSema version 2.0.0. Read more about remill here.
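The two steps above can be sketched as follows. The tool flags and the IDA path shown here are illustrative; consult each tool's `--help` output for the exact options supported by your build:

```shell
# Step 1: recover a control flow graph with your disassembler of choice
mcsema-disass --disassembler /opt/ida-7.1/idat64 --os linux --arch amd64 \
    --entrypoint main --binary /bin/ls --output /tmp/ls.cfg

# Step 2: translate the recovered CFG into LLVM bitcode
mcsema-lift-4.0 --os linux --arch amd64 --cfg /tmp/ls.cfg --output /tmp/ls.bc
```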

McSema and remill were developed and are maintained by Trail of Bits, funded by and used in research for DARPA and the US Department of Defense.

Why would anyone translate binaries back to bitcode?

Comparison with other machine code to LLVM bitcode lifters

| | McSema | dagger | llvm-mctoll | retdec | reopt | rev.ng | bin2llvm | fcd | RevGen | fracture | libbeauty |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Actively maintained? | Yes | No | Yes | Yes | Yes | No | Maybe | Maybe | Maybe | No | Yes |
| Commercial support available? | Yes | No | No | No | Maybe | No | No | No | No | Maybe | No |
| LLVM versions | 3.5 - current | 5 | current | 4.0 | 3.8 | 3.8 | 3.2 | 4 | 3.9 | 3.4 | 6 |
| Builds with CI? | Yes | No | No | Yes | No | No | Yes | Maybe | Maybe | No | No |
| 32-bit architectures | x86 | x86 | ARM | x86, ARM, MIPS, PIC32, PowerPC | | ARM, MIPS | S2E | S2E | S2E | ARM, x86 | |
| 64-bit architectures | x86-64, AArch64 | x86-64, AArch64 | x86-64 | x86-64, arm64 & more | x86-64 | x86-64 | S2E | S2E | | PowerPC | x86-64 |
| Control-flow recovery | IDA Pro, Binary Ninja, DynInst | Ad-hoc | Ad-hoc | Ad-hoc | Ad-hoc | Ad-hoc | Ad-hoc | Ad-hoc | McSema | Ad-hoc | Ad-hoc |
| File formats | ELF, PE | ELF, Mach-O | | ELF, PE, Mach-O, COFF, AR, Intel HEX, Raw | ELF | ELF | ELF | ELF, PE | | ELF, Mach-O (maybe) | ELF |
| Bitcode is executable? | Yes | Yes | Yes | Yes | Yes | Yes | No | No | CGC | No | No |
| C++ exceptions support? | Yes | No | No | No | No | Indirectly | No | No | No | No | Maybe |
| Lifts stack variables? | Yes | No | Maybe | Yes | No | No | No | Yes | No | No | Maybe |
| Lifts global variables? | Yes | Maybe | Yes | Yes | No | Maybe | No | No | No | Yes | Maybe |
| Has a test suite? | Yes | No | Yes | Yes | Yes | Yes | Yes | Yes | No | Yes | No |

Note: We label some architectures as "S2E" to mean any architecture supported by the S2E system. A system using "McSema" for control-flow recovery (e.g. RevGen) uses McSema's CFG.proto format for recovering control-flow. In the case of RevGen, only bitcode produced from DARPA Cyber Grand Challenge (CGC) binaries is executable.


Dependencies

| Name | Version |
| ---- | ------- |
| Git | Latest |
| CMake | Latest |
| Google Protobuf | 2.6.1 |
| Google Flags | Latest |
| Google Log | Latest |
| Google Test | Latest |
| Intel XED | Latest |
| LLVM | 3.5+ |
| Clang | 3.5+ |
| Python | 2.7 |
| Python Package Index | Latest |
| python-protobuf | 3.2.0 |
| IDA Pro | 7.1+ |
| Binary Ninja | Latest |
| Dyninst | 9.3.2 |

Getting and building the code


Step 1: Clone McSema

git clone --depth 1
cd mcsema

Step 2: Add your disassembler (optional)

Currently IDA, Binary Ninja, and Dyninst are supported for control-flow recovery. Installing your disassembler of choice in the Dockerfile is left as an exercise to the reader, but an example of installing Binary Ninja is provided (remember that Docker paths must be relative to the build context):

ADD local-relative/path/to/binaryninja/ /root/binaryninja/
ADD local-relative/path/to/.binaryninja/ /root/.binaryninja/ # <- Make sure there's no `lastrun` file
RUN /root/binaryninja/scripts/

Step 3: Build & Run Dockerfile

This will build the container for you and run it with your local directory mounted into the container (at /mcsema/local) such that your work in the container is saved locally:

# build mcsema container
ARCH=amd64; UBUNTU=18.04; LLVM=800; docker build . \
  -t mcsema:llvm${LLVM}-ubuntu${UBUNTU}-${ARCH} \
  -f Dockerfile \
  --build-arg UBUNTU_VERSION=${UBUNTU} \
  --build-arg LLVM_VERSION=${LLVM} \
  --build-arg ARCH=${ARCH}

# run mcsema container
docker run --rm -it --ipc=host -v "$(pwd)":/mcsema/local mcsema:llvm${LLVM}-ubuntu${UBUNTU}-${ARCH}

Native Build On Linux

Step 1: Install dependencies

sudo apt-get update
sudo apt-get upgrade

sudo apt-get install \
     git \
     curl \
     cmake \
     python2.7 python-pip python-virtualenv \
     wget \
     build-essential \
     gcc-multilib g++-multilib \
     libtinfo-dev \
     lsb-release \
     zlib1g-dev

For Ubuntu 16.04 and 14.04 you also need to install the realpath package.

If you are going to be using IDA Pro for CFG recovery also do the following:

sudo dpkg --add-architecture i386
sudo apt-get install zip zlib1g-dev:i386

Step 1.5 (Optional): Create a virtualenv for your McSema installation

Using a virtualenv ensures that your mcsema installation does not interfere with other software packages. This setup is especially helpful if you are hacking on mcsema and want to avoid clobbering a global, working version with development code.

mkdir mcsema-ve
virtualenv mcsema-ve
cd mcsema-ve
source bin/activate

Fixing IDA Pro's Python installation (Ubuntu 14.04)

Note: If you are using IDA on 64 bit Ubuntu and your IDA install does not use the system Python, you can add the protobuf library manually to IDA's zip of modules.

# Python module dir is generally in /usr/lib or /usr/local/lib
GOOGLEMODULE=$(python -c "import os; import sys; import google; sys.stdout.write(os.path.dirname(google.__file__))")
# Point IDAPYTHON at the Python module zip inside your IDA install (path varies)
IDAPYTHON=/path/to/ida/python/lib/python27.zip
pushd ${GOOGLEMODULE}/..
chmod +w ${IDAPYTHON}
zip -rv ${IDAPYTHON} google/
chmod -w ${IDAPYTHON}
popd
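The reason this works: Python can import packages directly from a zip archive on sys.path (zipimport), so adding the google/ package tree to IDA's module zip makes it importable. A self-contained sketch of the mechanism, using python3 and a throwaway zip (no IDA required):

```shell
tmp=$(mktemp -d)
# Build a tiny zip containing a package, then import from it, just as
# IDAPython does with its bundled module zip.
result=$(python3 - "$tmp" <<'EOF'
import sys, zipfile
zpath = sys.argv[1] + "/modules.zip"
with zipfile.ZipFile(zpath, "w") as z:
    z.writestr("pkg/__init__.py", "VALUE = 42\n")
sys.path.insert(0, zpath)   # zipimport now makes pkg importable
import pkg
print(pkg.VALUE)
EOF
)
echo "$result"
```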

Step 2: Clone the repository

The next step is to clone the Remill repository. We then clone the McSema repository into the tools subdirectory of Remill. This is kind of like how Clang and LLVM are distributed separately, and the Clang source code needs to be put into LLVM's tools directory.

Notice that when building McSema, you should always use a specific Remill commit hash (the one we test). This hash can be found in the .remill_commit_id file.

git clone --depth 1
export REMILL_VERSION=`cat ./mcsema/.remill_commit_id`

git clone
cd remill
git checkout -b temp ${REMILL_VERSION}

mv ../mcsema tools

Step 3: Build McSema

McSema is a kind of sub-project of Remill, similar to how Clang is a sub-project of LLVM. To that end, we invoke Remill's build script to build both Remill and McSema. It will also download all remaining dependencies needed by Remill.

The following script will build Remill and McSema into the remill-build directory, which will be placed in the current working directory.

if [ -z "${VIRTUAL_ENV}" ]; then
  # no virtualenv; global install for all users
  ./scripts/
else
  # found a virtualenv; local install
  ./scripts/ --prefix $(realpath ../)
fi

In case you want to use Dyninst as the frontend do this instead (after Dyninst 9.3.2 has been installed to the standard /usr/local location):

export CMAKE_PREFIX_PATH=/usr/local/lib/cmake/Dyninst
if [ -z "${VIRTUAL_ENV}" ]; then
  # no virtualenv; global install for all users
  ./scripts/ --dyninst-frontend
else
  # found a virtualenv; local install
  ./scripts/ --dyninst-frontend --prefix $(realpath ../)
fi

Details can be found in Dyninst frontend.

This script accepts several command line options.

Step 4: Install McSema

The next step is to build the code.

cd remill-build
if [ -z "${VIRTUAL_ENV}" ]; then
  # no virtualenv; global install for all users requires sudo
  sudo make install
else
  # found a virtualenv; local install does not need root
  make install
fi

Once installed, you may use mcsema-disass for disassembling binaries, and mcsema-lift-4.0 for lifting the disassembled binaries. If you specified --llvm-version 3.6 to the script, then you would use mcsema-lift-3.6.

Step 5: Verifying Your McSema Installation

In order to verify that McSema works correctly as built, head on over to the documentation on integration tests. Check that you can run the tests and that they pass.

On Windows

Step 1: Installing the toolchain

Visual Studio

  1. Click on "Tools for Visual Studio 2019" and download the "Build Tools for Visual Studio 2019" installer from the Visual Studio downloads page
  2. Select "MSVC v142 - VS 2019 C++ x64/x86 build tools" and confirm the installation

LLVM

  1. Get the LLVM 9 (x64) installer from the LLVM download page
  2. Do NOT enable "Add to PATH"

Python

  1. Get the latest Python 2.7 (x64) installer from the official download page
  2. Enable "Add to PATH"

CMake

  1. Download the CMake (x64) installer
  2. Enable "Add to PATH"

Step 2: Obtaining the source code

git clone --depth=1
git clone --depth=1 remill/tools/mcsema

Note that for production usage you should always use a specific remill commit (remill/tools/mcsema/.remill_commit_id) when building mcsema. At the time of writing, however, it is best to use HEAD (or at least make sure that commit e7795be is present in the remill branch).

cd remill
git fetch --unshallow
git checkout -b production `cat tools/mcsema/.remill_commit_id`

Step 3: Enabling the LLVM toolchain for Visual Studio

Download the official extension from the marketplace.

Automatic installation

Only works for the full Visual Studio IDE. Double clicking the extension should automatically install it.

Manual installation

The extension is in fact a ZIP archive; extract it and copy the VCTargets folder to the right location.

Step 4: Dependencies

It's time to fetch library dependencies. You can either build them yourself using our cxx-common dependency manager or download a pre-built package.

There are two versions of LLVM used by remill and mcsema. One version (currently 7.0.1) is used to build remill and mcsema themselves. Another version (currently 5.0.1) is used to build the translation semantics.

On Windows, only the LLVM 5.0.1 package is supported for building semantics. If you build it yourself, use the Visual Studio 2017 Win64 generator with the LLVM 5.0.1 toolchain. The cxx-common script will automatically take care of this requirement.

Binaries (extract to C:\Projects\tob_libraries)

Step 5: Building

Make sure to always execute the vcvars64.bat script from the "x64 Native Tools Command Prompt": C:\Program Files (x86)\Microsoft Visual Studio\2019\BuildTools\VC\Auxiliary\Build\vcvars64.bat.

mkdir remill_build
cd remill_build

cmake -G "Visual Studio 16 2019" -T llvm -A x64 -DCMAKE_BUILD_TYPE=Release -DLIBRARY_REPOSITORY_ROOT=C:\Projects\tob_libraries -DCMAKE_INSTALL_PREFIX=C:\ ..\remill
cmake --build . --config Release -- /maxcpucount:%NUMBER_OF_PROCESSORS%

If you are using a recent CMake version (> 3.13) you can also use the newly introduced cross-platform -j parameter:

cmake --build . --config Release -j %NUMBER_OF_PROCESSORS%

Step 6: Installing

cmake --build . --config Release --target install

You should now have the following directories: C:\mcsema, C:\remill.

Step 7: Running McSema

Add the McSema python package to Python

Make extra sure it only contains ASCII characters with no newlines! The following command should work fine under cmd:

echo|set /p="C:\mcsema\Lib\site-packages" > "C:\Python27\Lib\site-packages\mcsema.pth"
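The .pth trick works because Python's site machinery appends each line of a .pth file found in a site directory to sys.path. A minimal, self-contained sketch of the mechanism (shown with bash and python3 for brevity; the behavior on Windows is the same):

```shell
tmp=$(mktemp -d)
mkdir -p "$tmp/site" "$tmp/extra"
# A .pth file whose single line is the path we want added to sys.path
printf '%s\n' "$tmp/extra" > "$tmp/site/demo.pth"
result=$(python3 - "$tmp" <<'EOF'
import site, sys
base = sys.argv[1]
site.addsitedir(base + "/site")   # processes demo.pth
print((base + "/extra") in sys.path)
EOF
)
echo "$result"
```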

Install the libmagic DLL

pip install python-magic-bin

Update the PATH (cmd)

set PATH=%PATH%;C:\remill\bin;C:\mcsema\bin;C:\mcsema\Scripts

Update the PATH (PowerShell)

$Env:PATH += ";C:\remill\bin;C:\mcsema\bin;C:\mcsema\Scripts"

Additional Documentation

Getting help

If you are experiencing problems with McSema or just want to learn more and contribute, join the #binary-lifting channel of the Empire Hacking Slack. Alternatively, you can join our mailing list or email us privately.


How do you pronounce McSema, and where did the name come from?

This is a hotly contested issue. We must explore the etymology of the name to find an answer. The "Mc" in McSema was originally a contraction of the words "Machine Code," and the "sema" is short for "semantics." At that time, McSema used LLVM's instruction decoder to take machine code bytes, and turn them into llvm::MCInst data structures. It is possible that "MC" in that case is pronounced em-see. Alas, even those who understand the origin of the name pronounce it as if it were related to America's favorite fast food joint.

Why do I need IDA Pro to use McSema?

You don't! You can also use Binary Ninja or Dyninst to fill the role of IDA Pro; however, in our experiments, IDA Pro tends to be most reliable and both the product itself, and our scripts using it, have more person-years of development behind them.

What is Remill, and why does McSema need it?

Remill is a library that McSema uses to lift individual machine code instructions to LLVM IR. You can think of McSema being to Remill as Clang is to LLVM. Remill's scope is small: it focuses only on instruction semantics, providing semantics for x86, x86-64, and AArch64 instructions. McSema's scope is much bigger: it focuses on lifting entire programs. To do so, McSema must lift the individual instructions, but there's a lot more to lifting programs than just the instructions; there are code and data cross-references, segments, etc.

I'm a student and I'd like to contribute to McSema: how can I help?

We would love to take you on as an intern to help improve McSema. We have several project ideas labelled intern project, as well as smaller-scale to-dos labelled good first issue and help wanted on our issue tracker. You are not limited to those items: if you think of a great feature you want in McSema, let us know and we will sponsor it. Simply contact us on our Slack channel or by email and let us know what you'd want to work on and why.
We would love to take you on as an intern to help improve McSema. We have several project ideas labelled intern project, as well as having smaller scale to-dos labelled under good first issue and help wanted on our issue tracker. You are not limited to those items: if you think of a great feature you want in McSema, let us know and we will sponsor it. Simply contact us on our Slack channel or via and let us know what you'd want to work on and why.