
Getting started with BeetleboxCI and Vitis: Running FAST

This tutorial shows how to get started using BeetleboxCI and Vitis. We will cover the following:

  • Design - First we will set up a Vitis Vision project and create an accelerated FAST kernel
  • Simulate - Second we will simulate this in Vitis by setting up a test bench
  • BeetleboxCI Integration - Then we will integrate our project and test bench with BeetleboxCI
  • Review - To finish we will compare the simulation results from the CI with the local results

This tutorial is designed not to be board-specific. The accompanying git repository for this tutorial may be found here.

Tested Environment#

  • OS: Ubuntu 18.04
  • Vitis version: 2020.1
  • FPGA used: Zynq Ultrascale+ series ZCU104

Installation Guide#

Design#

  1. Create a directory called workspace in the Vitis folder and make it the current working directory, then launch Vitis. The following script assumes that Vitis was installed in the default directory; if it was not, use the correct installation directory.
# Make workspace
cd /tools/Xilinx/Vitis/2020.1
mkdir workspace
cd /tools/Xilinx/Vitis/2020.1/workspace
source /tools/Xilinx/Vitis/2020.1/settings64.sh
vitis

Download the following from the Xilinx Vitis embedded 2020.1 downloads, which may be found here.

  • ZCU104 Base 2020.1: extract this to the Vitis platforms folder.
  • ZYNQMP common image: extract this to the workspace folder, then open a terminal in the extracted folder and run ./sdk.sh. When prompted, extract to the directory /tools/Xilinx/Vitis/2020.1/workspace/. Now open a file browser in that location and copy the aarch64-xilinx-linux folder to the workspace folder.
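For reference, the same steps can be scripted. The following is a minimal sketch only: the archive names and the ~/Downloads location are assumptions that may not match what you downloaded, and sdk.sh normally asks for the install directory interactively.

# Sketch only: adjust archive names and paths to match your downloads.
cd ~/Downloads
# Platform: extract into the Vitis platforms folder
unzip xilinx_zcu104_base_202010_1.zip -d /tools/Xilinx/Vitis/2020.1/platforms/
# Common image: extract into the workspace, then run the sysroot installer
tar -xf xilinx-zynqmp-common-v2020.1.tar.gz -C /tools/Xilinx/Vitis/2020.1/workspace/
cd /tools/Xilinx/Vitis/2020.1/workspace/xilinx-zynqmp-common-v2020.1
./sdk.sh   # when prompted, install to /tools/Xilinx/Vitis/2020.1/workspace/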

  2. Create a new project
    1. Go to File->Project->New
    2. Select the xilinx_zcu104_base_202010_1 platform, then click Next
    3. Name the project test_fast_CI, then click Next
    4. Add the sysroot and common image we downloaded from Xilinx, then click Next
    5. Select the fast template from the L2 folder of the Vitis Vision library
    6. Click Finish


  3. Click the hammer icon to build the project for software emulation, then repeat this for the Hardware and Hardware Emulation targets by selecting the different active build configurations. The same builds can also be run from a terminal, as sketched below.
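A minimal command-line sketch of the same builds, assuming the default workspace path used above and that the build configurations have already been generated by Vitis (this mirrors the build.sh scripts we add in the CI integration section, which use the same incremental make target):

# Sketch: command-line builds for each target, run from the project directory.
source /tools/Xilinx/Vitis/2020.1/settings64.sh
cd /tools/Xilinx/Vitis/2020.1/workspace/test_fast_CI
(cd Emulation-SW && make -j8 incremental)   # software emulation
(cd Emulation-HW && make -j8 incremental)   # hardware emulation
(cd Hardware && make -j8 incremental)       # full hardware build (slowest)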

Simulate#

Right click on the project and select Run Configurations, then select Arguments and uncheck Automatically update arguments. Then type the following in the arguments box: ./small_image.png
Click Apply, then Run.

CI Integration#

Now that we have verified our FAST kernel locally, we will integrate it with BeetleboxCI. Using BeetleboxCI will allow us to iterate regularly on the FAST kernel, so any changes such as additional modules can be automatically tested. It will also allow us to run simulations for long periods of time without using local computing resources, allowing us to focus on other parts of a system.

We need to add some scripts to help build and simulate on the CI. First we will add simple build scripts to build the software emulation, hardware emulation and hardware targets. Then we will add simulation scripts to simulate the FAST kernel on the CI.

Make a file called build.sh with the following content and add it to the Emulation-HW folder in the project root. Then open a terminal where the file is located and make it executable using the following line: chmod 777 build.sh

#!/bin/bash -f
source /tools/Xilinx/Vitis/2020.1/settings64.sh
cd Emulation-HW/
make -j8 clean
make -j8 incremental
retval=$?
cd ..
if [ "$retval" == 0 ]
then
    exit 0
else
    exit 1
fi

Make a file called build.sh with the following content and add it to the Hardware folder in the project root. Then open a terminal where the file is located and make it executable using the following line: chmod 777 build.sh

#!/bin/bash -f
source /tools/Xilinx/Vitis/2020.1/settings64.sh
cd Hardware/
make -j8 clean
make -j8 incremental
retval=$?
cd ..
if [ "$retval" == 0 ]
then
    exit 0
else
    exit 1
fi

Make a file called build.sh with the following content and add it to the Emulation-SW folder in the project root. Then open a terminal where the file is located and make it executable using the following line: chmod 777 build.sh

#!/bin/bash -f
source /tools/Xilinx/Vitis/2020.1/settings64.sh
cd Emulation-SW/
make -j8 clean
make -j8 incremental
retval=$?
cd ..
if [ "$retval" == 0 ]
then
    exit 0
else
    exit 1
fi
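Before handing these scripts to the CI, it is worth checking them locally. A quick sanity check run from the project root (each script exits non-zero on a failed build, so the echo only fires on success):

# Run from /tools/Xilinx/Vitis/2020.1/workspace/test_fast_CI
./Emulation-SW/build.sh && echo "SW emulation build OK"
./Emulation-HW/build.sh && echo "HW emulation build OK"
./Hardware/build.sh && echo "Hardware build OK"   # the full hardware build is the slowest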

Now we will add the simulation scripts that make running the emulations on the CI easy.

Make a file called hw_emu.sh with the following content and add it to the project root folder. Then open a terminal where the file is located and make it executable using the following line: chmod 777 hw_emu.sh

#!/bin/bash -f
Emulation-HW/launch.sh
Emulation-HW/launch.expect

Make a launch.sh file in the Emulation-HW folder with the following code to run the simulation. Then open a terminal where the file is located and make it executable using the following line: chmod 777 launch.sh

#!/bin/bash -f
source /tools/Xilinx/Vitis/2020.1/settings64.sh
cd Emulation-HW/package/
killall -9 qemu-system-aarch64
killall pllauncher
EMUDIR="$PWD/package"
if [ -d "$EMUDIR" ]
then
    OUTDIR=$EMUDIR
    cd $OUTDIR
    echo "Current working dir $OUTDIR"
    if [ -e "qemu_args.txt" ]
    then
        echo "Required emulation files like qemu_args exists"
    else
        echo "Required emulation files like qemu_args.txt doesn't exist. Please run from output dir"
        exit 1
    fi
else
    if [ -e "qemu_args.txt" ]
    then
        echo "Required emulation files like qemu_args exists"
        cd .
        echo "Current working dir $PWD"
    else
        echo "Required emulation files like qemu_args.txt doesn't exist. Please run from output dir"
        exit 2
    fi
fi

Make a launch.expect file in the Emulation-HW folder with the following code to read the simulation output. It drives the emulator console: it waits for the PetaLinux prompt, sets up the emulation environment variables, runs the test application against small_image.png and then exits. Then open a terminal where the file is located and make it executable using the following line: chmod 777 launch.expect

#!/usr/bin/expect -f
set timeout -1
spawn /tools/Xilinx/Vitis/2020.1/bin/launch_emulator -device-family ultrascale -target hw_emu -qemu-args-file Emulation-HW/package/qemu_args.txt -pmc-args-file Emulation-HW/package/pmu_args.txt -pl-sim-dir Emulation-HW/package/sim/behav_waveform/xsim -sd-card-image Emulation-HW/package/sd_card.img -enable-prep-target -xtlm-log-state WAVEFORM_AND_LOG -forward-port 2222 22
expect "PetaLinux 2020.1 zynqmp-common-2020_1 ttyPS0"
expect "# "
send "\n"
expect "# "
send "\n"
expect "# "
send "cd /mnt\n"
expect "# "
send "cd sd-mmcblk0p1\n"
expect "# "
send "source ./init.sh\n"
expect "# "
send "export XCL_EMULATION_MODE=hw_emu\n"
expect "# "
send "export XILINX_XRT=/usr\n"
expect "# "
send "export XILINX_VITIS=/mnt/sd-mmcblk0p1/\n"
expect "# "
send "export LD_LIBRARY_PATH=/mnt/sd-mmcblk0p1/:/tmp\n"
expect "# "
send "./test_fast_CI small_image.png\n"
expect "# "
send "exit\n"
exit 0

Make a file called sw_emu.sh with the following content and add it to the project root folder. Then open a terminal where the file is located and make it executable using the following line: chmod 777 sw_emu.sh

#!/bin/bash -f
Emulation-SW/launch.sh
Emulation-SW/launch.expect

Make a launch.sh file in the Emulation-SW folder with the following code to run the simulation. Then open a terminal where the file is located and make it executable using the following line: chmod 777 launch.sh

#!/bin/bash -f
source /tools/Xilinx/Vitis/2020.1/settings64.sh
cd Emulation-SW/package/
killall -9 qemu-system-aarch64
killall pllauncher
EMUDIR="$PWD/package"
if [ -d "$EMUDIR" ]
then
    OUTDIR=$EMUDIR
    cd $OUTDIR
    echo "Current working dir $OUTDIR"
    if [ -e "qemu_args.txt" ]
    then
        echo "Required emulation files like qemu_args exists"
    else
        echo "Required emulation files like qemu_args.txt doesn't exist. Please run from output dir"
        exit 1
    fi
else
    if [ -e "qemu_args.txt" ]
    then
        echo "Required emulation files like qemu_args exists"
        cd .
        echo "Current working dir $PWD"
    else
        echo "Required emulation files like qemu_args.txt doesn't exist. Please run from output dir"
        exit 2
    fi
fi

Make a launch.expect file in the Emulation-SW folder with the following code to read the simulation output. Then open a terminal where the file is located and make it executable using the following line: chmod 777 launch.expect

#!/usr/bin/expect -f
set timeout -1
spawn /tools/Xilinx/Vitis/2020.1/bin/launch_emulator -device-family ultrascale -target sw_emu -qemu-args-file Emulation-SW/package/qemu_args.txt -pmc-args-file Emulation-SW/package/pmu_args.txt -sd-card-image Emulation-SW/package/sd_card.img -enable-prep-target
expect "PetaLinux 2020.1 zynqmp-common-2020_1 ttyPS0"
expect "# "
send "\n"
expect "# "
send "\n"
expect "# "
send "cd /mnt\n"
expect "# "
send "cd sd-mmcblk0p1\n"
expect "# "
send "source ./init.sh\n"
expect "# "
send "export XCL_EMULATION_MODE=sw_emu\n"
expect "# "
send "export XILINX_XRT=/usr\n"
expect "# "
send "export XILINX_VITIS=/mnt/sd-mmcblk0p1/\n"
expect "# "
send "export LD_LIBRARY_PATH=/mnt/sd-mmcblk0p1/:/tmp\n"
expect "# "
send "./test_fast_CI small_image.png\n"
expect "# "
send "exit\n"
exit 0
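With the launcher scripts in place, both emulation flows can also be exercised locally from the project root; these are the same commands the CI jobs will run:

./sw_emu.sh   # software emulation (fast functional check)
./hw_emu.sh   # hardware emulation (much slower, runs the kernel RTL in the simulator)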

Make a folder called .bbx in the project root directory and add a file called config.yaml with the following code:

runners:
  AWS-runner:
    image: ubuntu-vitis-2020-1

jobs:
  build-SW-emu:
    runner: AWS-runner
    type:
      build: software
    current_working_directory: /tools/Xilinx/Vitis/2020.1/workspace/test_fast_CI
    input:
      artifact:
        - xilinx_zcu104_base_202010_1.zip
        - aarch64-xilinx-linux.zip
        - xilinx-zynqmp-common-v2020.1.zip
    output:
      artifact:
        - Emulation-SW
        - sw_emu.sh
    steps:
      - run:
          name: Download files
          command: |
            git clone https://github.com/mwijeyasinghebbx/test_images
            mv test_images/* /tools/Xilinx/Vitis/2020.1/platforms/xilinx_zcu104_base_202010_1/sw/xilinx_zcu104_base_202010_1/xrt/image/
          type: miscellaneous
      - run:
          name: Run Software Emulation Build
          command: Emulation-SW/build.sh
          type: sw-emulation
  test-sw-emulation:
    runner: AWS-runner
    depends:
      - build-SW-emu
    type:
      test: sw-emulation
    current_working_directory: /tools/Xilinx/Vitis/2020.1/workspace/test_fast_CI
    input:
      artifact:
        - build-SW-emu
        - xilinx_zcu104_base_202010_1.zip
    steps:
      - run:
          name: Run Software Emulation
          command: ./sw_emu.sh
  build-HW-emu:
    runner: AWS-runner
    type:
      build: hw-emulation
    current_working_directory: /tools/Xilinx/Vitis/2020.1/workspace/test_fast_CI
    output:
      artifact:
        - Emulation-HW/build.sh
        - hw_emu.sh
        - Emulation-HW/launch.sh
        - Emulation-HW/launch.expect
        - Emulation-HW/package
    input:
      artifact:
        - xilinx_zcu104_base_202010_1.zip
        - aarch64-xilinx-linux.zip
        - xilinx-zynqmp-common-v2020.1.zip
    steps:
      - run:
          name: Download files
          command: |
            git clone https://github.com/mwijeyasinghebbx/test_images
            mv test_images/* /tools/Xilinx/Vitis/2020.1/platforms/xilinx_zcu104_base_202010_1/sw/xilinx_zcu104_base_202010_1/xrt/image/
          type: miscellaneous
      - run:
          name: Run Hardware Emulation Build
          command: Emulation-HW/build.sh
          type: hw-emulation
  test-hw-emulation:
    runner: AWS-runner
    depends:
      - build-HW-emu
    type:
      test: hw-emulation
    current_working_directory: /tools/Xilinx/Vitis/2020.1/workspace/test_fast_CI
    input:
      artifact:
        - build-HW-emu
        - xilinx_zcu104_base_202010_1.zip
    steps:
      - run:
          name: Hardware Emulation
          command: ./hw_emu.sh
  build-HW:
    runner: AWS-runner
    type:
      build: hardware
    current_working_directory: /tools/Xilinx/Vitis/2020.1/workspace/test_fast_CI
    input:
      artifact:
        - xilinx_zcu104_base_202010_1.zip
        - aarch64-xilinx-linux.zip
        - xilinx-zynqmp-common-v2020.1.zip
    steps:
      - run:
          name: Build HW
          command: Hardware/build.sh
          type: Hardware

workflows:
  complete-build-test:
    jobs:
      - build-SW-emu
      - test-sw-emulation
      - build-HW-emu
      - test-hw-emulation
      - build-HW

The .bbx/config.yaml is the configuration file that BeetleboxCI will use to automate this application. Each project in BeetleboxCI consists of a pipeline, which is made up of workflows. Workflows are a series of jobs to be executed and are configurable by the user in the configuration file under the workflows section. The jobs themselves are specified under jobs; in our case we have five jobs (build-SW-emu, test-sw-emulation, build-HW-emu, test-hw-emulation and build-HW) grouped into a single workflow called complete-build-test. For each job we specify the runner to be used as AWS-runner, which uses the machine image ubuntu-vitis-2020-1. This machine image is specifically designed to run the Vitis and Vivado tools and provides 4 CPUs and 32GB of RAM. We also specify a type, such as build: hardware or test: sw-emulation. This is an optional setting, but it can be used to help identify the purpose of each job. We then identify the artifacts that are to be stored in our artifact store. Finally, we provide the steps that the job is to perform. We need to run commands in a bash shell, so we use the run step, providing a name and the individual commands to be run in our bash shell through command.

We need to ensure that only source code is committed to our GitHub repository and not large build files. To do so, we make a .gitignore file in the project root directory and add the following code (the !.bbx/*.yaml rule ensures the CI configuration itself is never ignored):

!.bbx/*.yaml
#####################
#Custom User Ignores
#####################
#Ignore any folders that contains large files
large_file_folder/
_ide/
Emulation-SW/krnl_fast.build
Emulation-SW/*.xclbin
Emulation-SW/*.build
Emulation-SW/.Xil
Emulation-SW/src
Emulation-SW/package.build
Emulation-SW/package
Emulation-SW/ip_cache
Hardware/krnl_fast.build
Hardware/*.xclbin
Hardware/.Xil
Hardware/src
Hardware/package.build
Hardware/package
Hardware/ip_cache
Emulation-HW/krnl_fast.build
Emulation-HW/*.xclbin
Emulation-HW/.Xil
Emulation-HW/src
Emulation-HW/package.build
Emulation-HW/package
Emulation-HW/ip_cache

#Compiled files
*.xclbin
*.info
*.link_summary
*.xclbin.sh
*.json
*.mdb
*.package_summary
*.img
hw_tester

#Ignore any files used for debugging
*.debug
*.prj
.settings
.project
.cproject
*.png
*.jou

#####################
#Default Python ignores
#####################
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class

# C extensions
*.so

# Distribution / packaging
.Python
env/
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib64/
parts/
sdist/
var/
wheels/
*.egg-info/
.installed.cfg
*.egg

# PyInstaller
#  Usually these files are written by a python script from a template
#  before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
.hypothesis/

# Translations
*.mo
*.pot

# Django stuff:
*.log
local_settings.py

# Flask stuff:
instance/
.webassets-cache

# Scrapy stuff:
.scrapy

# Sphinx documentation
docs/_build/

# PyBuilder
target/

# Jupyter Notebook
.ipynb_checkpoints

# pyenv
.python-version

# celery beat schedule file
celerybeat-schedule

# SageMath parsed files
*.sage.py

# dotenv
.env

# virtualenv
.venv
venv/
ENV/

# Spyder project settings
.spyderproject
.spyproject

# Rope project settings
.ropeproject

# mkdocs documentation
/site

# mypy
.mypy_cache/

#Ignore compiled code
*.pyc

#Ignore files generated at runtime
*.out

#####################
#Default C ignores
#####################
# Prerequisites
*.d

# Object files
*.o
*.ko
*.obj
*.elf

# Linker output
*.ilk
*.map
*.exp

# Precompiled Headers
*.gch
*.pch

# Libraries
*.lib
*.a
*.la
*.lo

# Shared objects (inc. Windows DLLs)
*.dll
*.so
*.so.*
*.dylib

# Executables
*.exe
*.out
*.app
*.i*86
*.x86_64
*.hex

# Debug files
*.dSYM/
*.su
*.idb
*.pdb

# Kernel Module Compile Results
*.mod*
*.cmd
.tmp_versions/
modules.order
Module.symvers
Mkfile.old
dkms.conf

#####################
#Default C++ ignores
#####################
# Prerequisites
*.d

# Compiled Object files
*.slo
*.lo
*.o
*.obj

# Precompiled Headers
*.gch
*.pch

# Compiled Dynamic libraries
*.so
*.dylib
*.dll

# Fortran module files
*.mod
*.smod

# Compiled Static libraries
*.lai
*.la
*.a
*.lib

# Executables
*.exe
*.out
*.app

  1. Create a new GitHub repository named test_fast_CI and upload the project to it.
  2. Now log into your BeetleboxCI account here.
  3. You will see the following screen. Click Artifact Store, then Upload Artifact.
  4. Upload the platform that we downloaded from Xilinx.
  5. Next upload the sysroot and choose library.
  6. Then upload the common image.
  7. Fill in the details, then click Submit. You will be presented with your personal GitHub webhook and deploy key information.

  8. Leave this information open and head to your git repo. From there go to the repository settings, click Webhooks, then click Add webhook.

  9. Copy the payload URL from the CI into the GitHub settings, change the content type to application/json, copy over the secret from the CI to GitHub, then click Add webhook.


  10. We should then see a green tick to say the git repo has connected correctly.
  11. Now in the GitHub settings, navigate to Deploy keys and click Add deploy key.
  12. Title the deploy key testci and copy and paste the deploy key from the CI into the key input. Then click Add key.


  13. Now to run the project on the CI, we must commit to the git repo. This will trigger the webhook and the CI will start building. For this example we can simply edit the readme.md file.

  14. In the project repo there will be a button saying Add a README file. Click this and follow the instructions.

  15. When you have committed the readme to the project, the CI will pick this up and start building. When the build is finished, the CI will show the result.
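If you prefer the command line to the GitHub web editor, pushing any small change triggers the same webhook. A sketch, assuming a hypothetical clone URL and that your default branch is main:

# Hypothetical repository URL and branch name; substitute your own.
git clone git@github.com:<your-username>/test_fast_CI.git
cd test_fast_CI
echo "Trigger BeetleboxCI build" >> readme.md
git add readme.md
git commit -m "Trigger BeetleboxCI pipeline"
git push origin main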


Review#

  1. Open the project in the CI
  2. Navigate to the logs of the test-sw-emulation job
  3. Click logs
  4. Compare this to the simulation output of the Vitis terminal