Kortexino Coding Challenges

Learn how to program Kortexino Bot via Arduino in these fun challenges


Challenge 0: The "Zerobot"

Your starting point into Kortexino programming


Download the zerobot code and library here. Open it in the Arduino IDE and inspect the main zerobot file shown in the image to the left. Don't worry about the other files - these only contain the code for the Bluetooth and robot communication, which will always be the same.

The zerobot file shown to the left, however, does not do much. It just starts the Bluetooth communication and initializes the control of the motors built into the Kortexino bot.

In the following challenges, you are tasked to extend the zerobot file to generate more interesting robot behaviors.

  • 3D FACE POSITION AND ORIENTATION - The AI in Kortexino precisely measures face distance and the face angle relative to the device (yaw, pitch and roll).
  • FACIAL EXPRESSIONS - Kortexino measures how much you smile, kiss, put your tongue out, open your mouth, wink your right or left eye, pull your brows up or down, show your teeth, widen your eyes, frown, or close both eyes.
  • EMOTION DETECTION - You can use this information in your own project to detect emotions that are related to these facial expressions.
  • REQUIREMENTS - Due to the implemented high level AI algorithms, Kortexino requires an iPhone X or later.

Challenge 1: The "Shybot"

Your first simple project

Program a shy robot that always looks away if you try to look into its "eyes"! In this first challenge, modify the zerobot code that you downloaded in Challenge 0 to include a simple behavior: If the Kortexino app detects a face, the Kortexino robot should rotate away from your face; if the app does not detect a face, the rotation should stop.


For this you will need to read out the following variables:

kortexino.face_detected

kortexino.yaw_to_bot


And you need to use the following commands:

robot.turn_right(char motor_speed)

robot.turn_left(char motor_speed)


Look at the API Reference to find out more about these commands.

Tip: You should set the motor speed to make the robot turn smoothly, for example using:

robot.turn_right(20)

robot.turn_left(20)
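
A minimal sketch of the loop logic could look like the lines below. The setup()/loop() structure, the call to kortexino.read_data() (introduced in Challenge 3), and the robot.stop() command are assumptions for illustration - check the zerobot file and the API Reference for the actual way to refresh the sensor values and stop the motors.

void loop() {
  kortexino.read_data();              // refresh the sensor values, if your version requires it (see Challenge 3)

  if (kortexino.face_detected) {
    // Turn away from the face. The sign convention of yaw_to_bot is assumed here:
    // swap the two calls if your robot turns towards you instead of away.
    if (kortexino.yaw_to_bot > 0) {
      robot.turn_right(20);
    } else {
      robot.turn_left(20);
    }
  } else {
    robot.stop();                     // hypothetical stop command - see the API Reference for the real one
  }
}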


If you get stuck, you can see the solution here.

Challenge 2: The "Starebot"

Slowly getting more complex

Program a robot that always stares at you - meaning that it always turns around to directly point towards your face and follows your face movements! In this second challenge, modify your shybot code to enable a more interesting behavior: If the Kortexino app detects a face, the Kortexino robot should rotate towards your face.


For this you will need to use the same variables and functions as in the previous challenge:

kortexino.face_detected

kortexino.yaw_to_bot

robot.turn_right(char motor_speed)

robot.turn_left(char motor_speed)


Tip: You should set the motor speed depending on the yaw angle to control the turning smoothly, for example as sketched below.
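
The sketch below shows one possible proportional control of the turning speed. The gain, the speed limits, and the sign convention of kortexino.yaw_to_bot are assumptions - adjust them to your robot.

void loop() {
  kortexino.read_data();                          // refresh the sensor values, if necessary

  if (kortexino.face_detected) {
    // Proportional control: the larger the yaw angle, the faster the turn
    char speed = abs(kortexino.yaw_to_bot) / 2;   // assumed gain of 0.5
    if (speed > 40) speed = 40;                   // cap the speed
    if (speed < 5)  speed = 5;                    // keep a minimum so the robot still moves

    if (kortexino.yaw_to_bot > 0) {
      robot.turn_left(speed);                     // turn towards the face; swap the calls if it turns away
    } else {
      robot.turn_right(speed);
    }
  }
}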


If you get stuck, you can see the solution here.

Challenge 3: The "Ignorebot"

Using other sensor inputs

Program a robot that is very ignorant, making an exact half turn (180°) if it encounters a face! In this challenge, modify your code to include a specific robot movement: If the Kortexino app detects a face, the Kortexino robot should rotate exactly by 180°.


In addition to some of the variables that you used above, you will need the variable  

kortexino.compass

and you might need to use the function

kortexino.read_data()


Look at the API Reference to find out more about these commands.

Tip: Reading out the compass can help you to make an exact 180 degree turn. You might use the compass readings together with your own variables. Make sure you use the kortexino.read_data() command to update the sensor readings, if necessary.
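
One way to use the compass for the half turn is sketched below. It assumes that kortexino.compass returns a heading in degrees (0-359); the tolerance value, the helper variables, and the robot.stop() command are our own additions, not part of the library.

bool turning = false;      // our own helper: are we currently doing the half turn?
int  target_heading;       // our own helper: compass heading we want to reach

void loop() {
  kortexino.read_data();                          // update the compass and face readings

  if (kortexino.face_detected && !turning) {
    // Start the half turn: remember which heading is 180 degrees away
    target_heading = ((int)kortexino.compass + 180) % 360;
    turning = true;
  }

  if (turning) {
    // Keep turning until the heading is close enough to the target
    int error = abs((int)kortexino.compass - target_heading);
    if (error > 180) error = 360 - error;         // wrap around the 0/360 boundary

    if (error > 5) {                              // assumed tolerance of 5 degrees
      robot.turn_right(20);
    } else {
      robot.stop();                               // hypothetical stop command - see API Reference
      turning = false;
    }
  }
}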


If you get stuck, you can see the solution here.

Challenge 4: The "Searchbot"

Programming two robot behavior states

Program a robot that always stares at you and searches for your face by rotating! In this challenge, modify the starebot code to include an additional behavior: If the Kortexino app does not detect a face, the Kortexino robot should rotate to search for the face. A search strategy that anticipates in which direction the face is more likely to be found would be beneficial.


For this you will need to use the same variables and functions as in Challenges 1 and 2:

kortexino.face_detected

kortexino.yaw_to_bot

robot.turn_right(char motor_speed)

robot.turn_left(char motor_speed)


Tip: An efficient search strategy can be implemented by choosing the search rotation direction based on the last detected yaw angle of the face.
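
The two behavior states could be combined roughly as sketched below. The variable last_yaw that remembers the direction of the last detected face is our own helper, not a library variable, and the turning directions follow the assumed sign convention from Challenge 2.

float last_yaw = 0;        // our own helper: yaw angle of the last detected face

void loop() {
  kortexino.read_data();                          // refresh the sensor values, if necessary

  if (kortexino.face_detected) {
    // State 1: stare - turn towards the face as in Challenge 2
    last_yaw = kortexino.yaw_to_bot;              // remember on which side the face was last seen
    if (kortexino.yaw_to_bot > 0) {
      robot.turn_left(20);
    } else {
      robot.turn_right(20);
    }
  } else {
    // State 2: search - rotate towards the side where the face was last seen
    if (last_yaw > 0) {
      robot.turn_left(20);
    } else {
      robot.turn_right(20);
    }
  }
}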


If you get stuck, you can see the solution here.

Challenge 5: The "Disgustbot"

Adding a third robot behavior state

Program a robot that is disgusted by you if you stick out your tongue! In this challenge, extend the Searchbot code to add a third behavior: If you stick out your tongue, the robot should turn around by 180°.


In addition to some of the variables that you used above, you will need this variable  

kortexino.tongue_out


Look at the API Reference to find out more about this variable.
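
The third state can be added as one more condition inside the loop, for example along these lines. The start_half_turn() helper only stands for your own 180° turn logic from Challenge 3 and is not a library function.

if (kortexino.face_detected && kortexino.tongue_out) {
  // Disgusted: trigger the 180 degree turn from Challenge 3
  start_half_turn();       // our own helper function, not part of the Kortexino library
}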


If you get stuck, you can see the solution here.

Challenge 6: The "Distancebot"

Advanced behavior

Program a robot that keeps a fixed distance to your face! In this challenge, extend the Searchbot code to automatically adjust the distance between the bot and your face.


In addition to some of the variables and commands that you used above, you will need this variable:

kortexino.face_distance


and these commands:

robot.move_forward(char motor_speed)

robot.move_backward(char motor_speed)


Look at the API Reference to find out more about these commands.
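
A simple way to hold the distance is a dead band around a target value, as sketched below. The target distance, the tolerance, and the unit of kortexino.face_distance are assumptions here - check the API Reference for the real unit, and tune the values on your robot.

const float TARGET_DISTANCE = 50;    // assumed target value, in whatever unit face_distance uses
const float TOLERANCE       = 5;     // dead band so the robot does not oscillate back and forth

void loop() {
  kortexino.read_data();                          // refresh the sensor values, if necessary

  if (kortexino.face_detected) {
    if (kortexino.face_distance > TARGET_DISTANCE + TOLERANCE) {
      robot.move_forward(20);                     // too far away: drive towards the face
    } else if (kortexino.face_distance < TARGET_DISTANCE - TOLERANCE) {
      robot.move_backward(20);                    // too close: back off
    } else {
      robot.stop();                               // hypothetical stop command - see API Reference
    }
  }
}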


If you get stuck, you can see the solution here.

Challenge 7: The "Emotibot"

Combine what you have learned to make an emotion sensing smart robot

Program a smart robot that explores its surroundings and reacts to your emotions! In this challenge, extend the Distancebot code to change the robot behavior if specific facial expressions are detected.

Use what you have learned so far, and extend the behaviors by adding additional variables that represent facial expressions, such as:

kortexino.kiss 


Look at the API Reference to find out more about available variables.
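
As a starting point, one additional expression check inside the loop could look like this; react_to_kiss() is only a placeholder for whatever behavior you choose, not a library function.

if (kortexino.face_detected && kortexino.kiss) {
  // React to the kiss expression, e.g. drive a little closer or spin around happily
  react_to_kiss();         // placeholder for your own behavior, not part of the Kortexino library
}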

If you get stuck, you can see one example solution here.