WORKSHOP ON TINYML

What you will learn

Embedded machine learning, also known as TinyML, is the field of machine learning applied to embedded systems.
You'll learn:
  • How to collect high-frequency data from real sensors
  • How to use signal processing to clean up data
  • How to build a neural network classifier
  • How to deploy your model back to a device

  • 1. Continuous motion recognition
  • Use machine learning to build a gesture recognition system that runs on a microcontroller.

  • 2. Responding to your voice
  • Use machine learning to build a system that can recognize audible events, particularly your voice, through audio classification.
  • The system you create will work similarly to "Hey Siri" or "OK, Google" and will be able to recognize keywords or other audible events, even in the presence of background noise or chatter.

  • 3. Recognize sounds from audio
  • Use machine learning to build a system that can recognize when a particular sound is happening, a task known as audio classification.
  • The system you create will be able to recognize the sound of water running from a faucet, even in the presence of other background noise.

  • 4. Adding sight to your sensors
  • Use machine learning to build a system that can recognize objects in your house through a camera connected to a microcontroller, a task known as image classification.

  • Hardware Requirements

    To follow along with the hands-on portions of the workshop, you MUST use the following hardware, which will be provided at the workshop:
    • TI CC1352P LaunchPad
    • BOOSTXL-SENSORS
    • CC3200AUDBOOST
    • Espressif ESP-EYE (ESP32)
    • Jumper wires
    • Micro USB cable (e.g. USB-A to Micro USB or USB-C to Micro USB).
    Charge-only Micro USB cables will NOT work!

    Software/Service Installations

    Attendees only need basic command-line skills and a 101-level understanding of microcontrollers and sensors (you can get by with copying and pasting the provided code!).
    To make the most of this workshop and follow along, you MUST come prepared with the following pre-installed:
    • Visual Studio Code
    • PlatformIO extension for VS Code
    • Node.js v12 or later (we recommend the latest LTS version)
    • If using Windows, install the "Tools for Native Modules" (offered as part of the Node.js installation process)
    • Edge Impulse CLI, installed with npm install -g edge-impulse-cli (a command sketch follows this list)
    • A free Edge Impulse Studio account
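
    A minimal command-line sketch of the installation, assuming Node.js and npm are already on your PATH (exact versions and output will vary by platform):

        # Confirm Node.js v12+ and npm are available
        node -v
        npm -v

        # Install the Edge Impulse CLI globally
        npm install -g edge-impulse-cli

        # On Windows, if the install fails on native modules, re-run the
        # Node.js installer with "Tools for Native Modules" enabled and retry.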

    To ensure that the workshop runs smoothly, please complete the following prerequisites before you arrive. Here are some useful links for you:
  • Create a free Edge Impulse Studio account.

  • Install Edge Impulse CLI

  • To install the Edge Impulse CLI, you need to:
    a. Install Python 3 on your host computer
    b. Install Node.js v14 or higher on your host computer
    (A quick check of these prerequisites is sketched after this list.)

  • Some systems may also need to install the latest version of Visual Studio, including the "Desktop development with C++" workload.

  • We have uploaded an installation guide and walkthroughs for all the exercises. Kindly go through them.

  • If you have any questions or concerns, please don't hesitate to reach out to us. Join our Discord server.
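
  • As a quick sanity check before Day 1, you can confirm the prerequisites above from a terminal (a rough sketch; the exact tool names and flags assume a default install and may differ slightly on your system, so see the Edge Impulse CLI docs if anything fails):

        # Python 3 and Node.js v14+ should both report their versions
        python3 --version
        node -v

        # After npm install -g edge-impulse-cli, the CLI tools should be on your PATH
        edge-impulse-daemon --help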

  • Who should attend

    Students of any major and any level of experience are welcome to participate.

    Duration

    2 Days - The workshop will run on two consecutive days. Each day we will present one category of machine learning models.

    Schedule

  • April 20th - Continuous motion recognition; Recognize sounds from audio
  • April 21st - Responding to your voice; Adding sight to your sensors

    Day                      Time        Event
    Day 1 (Friday 04/19)     3pm-6pm     Join our online session for assistance with installation queries (meeting link will be shared via email)
    Day 2 (Saturday 04/20)   9am-10am    Introduction
    Day 2                    10am-12pm   Hands-on lab, session 1: Continuous motion recognition
    Day 2                    1pm-2pm     Introduction
    Day 2                    2pm-4pm     Hands-on lab, session 2: Recognize sounds from audio and Responding to your voice
    Day 3 (Sunday 04/21)     9am-10am    Introduction
    Day 3                    10am-12pm   Hands-on lab, session 1: Adding sight to your sensors - Part 1
    Day 3                    1pm-2pm     Introduction
    Day 3                    2pm-4pm     Hands-on lab, session 2: Adding sight to your sensors - Part 2

    Where

    In person at the University of Texas at Dallas, Engineering and Computer Science Building South (ECSS).

    How to Participate:

    Register on the website. Hurry, seats are limited!

    Cost of participation

  • 100 USD for all days (four-project workshop)
  • 60 USD for students
  • 40 USD for IEEE/UTD/UNT/Tyler - contact the team for a promo code for this special discount

  • Certification Policy

    A Certificate of Merit will be awarded to all workshop participants.

    Connect with us

  • REGISTRATIONS NOW OPEN! Register Here

