

Your study buddy & machine learning-enabled robotic arm.

Year - 2022

Project Duration - 1 month

In Collaboration With - Leo Melloul & Maria Asif


Many of us find it difficult to maintain focus while studying and keep track of time while doing so. Sometimes all we need is some positive reinforcement to keep us focused and some negative reinforcement to get us back on track when we get distracted. Aleria is here to help you! 

Aleria watches you through your laptop camera. If she recognises that you are focused, she works with you and carefully assembles an array of 9 marbles on a stage. Each marble represents 20 minutes of your time spent working in a state of focus. This allows you to visualise the time you spend working productively in a more tangible and satisfying way.
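The bookkeeping behind the marble reward can be sketched in a few lines: each marble stands for 20 minutes of focused work, capped at the 9 slots on the stage. This is an illustrative sketch, not the project's actual code.

```python
# Each marble represents 20 focused minutes; the stage holds at most 9.
MINUTES_PER_MARBLE = 20
STAGE_CAPACITY = 9

def marbles_earned(focused_minutes: int) -> int:
    """Number of marbles Aleria should have placed on the stage."""
    return min(focused_minutes // MINUTES_PER_MARBLE, STAGE_CAPACITY)

print(marbles_earned(45))   # 2 marbles after 45 focused minutes
print(marbles_earned(300))  # capped at the 9 stage slots
```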

If Aleria sees that you are getting distracted, using your phone, or getting groggy, she first threatens you into refocusing by slowly tilting the stage of assembled marbles. If you remain unfocused, she gets upset and tilts the stage to an angle large enough for the marbles to roll off and drop below. The marbles land on an inclined ramp whose walls carefully guide them to a single collection point, where Aleria can pick them up and begin reassembling the array, in case you start focusing and being productive again.
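The warn-then-dump escalation described above can be sketched as a small state function: while you stay distracted, the stage first tilts a few degrees as a warning, then past a patience threshold tilts far enough for the marbles to roll off. The angles and tick counts here are assumptions for illustration, not the project's tuned values.

```python
WARN_TILT_DEG = 10    # gentle threat: marbles stay put (assumed angle)
DUMP_TILT_DEG = 45    # steep enough for the marbles to roll off (assumed)
PATIENCE_TICKS = 3    # distracted checks tolerated before dumping (assumed)

def stage_tilt(distracted_ticks: int) -> int:
    """Stage angle (degrees) given consecutive distracted detections."""
    if distracted_ticks == 0:
        return 0                 # focused: stage stays level
    if distracted_ticks <= PATIENCE_TICKS:
        return WARN_TILT_DEG     # first warnings: slow, visible tilt
    return DUMP_TILT_DEG         # still distracted: drop the marbles

print([stage_tilt(t) for t in range(6)])  # [0, 10, 10, 10, 45, 45]
```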

Our team was inspired to create a personified robot that accompanies you while you work and either rewards or punishes you based on your focus and commitment to work.


Aleria watches you through your laptop’s webcam. The ‘inputs’ it is waiting to detect are your expressions. These expressions or visual instances include being focused, looking away, looking at your phone, laughing, getting drowsy/sleepy, or simply being away from your workstation.

FaceOSC (built with openFrameworks) is an application that allows Aleria to track a face and detect pose, gestures, and expressions. The raw tracking data is transferred in real time via a serial port. The application pre-processes images and video by mapping an array of points onto your face, allowing it to detect even minute muscle movements in your face.
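The mapping from raw tracking values to the states Aleria reacts to can be sketched as a simple classifier. The feature names and thresholds below are illustrative assumptions, not FaceOSC's actual calibrated outputs.

```python
def classify_state(face_found: bool, eye_openness: float,
                   yaw_deg: float) -> str:
    """Map raw face-tracking values to one of Aleria's input states.

    All thresholds are hypothetical placeholders for illustration.
    """
    if not face_found:
        return "away"            # user has left the workstation
    if abs(yaw_deg) > 30:
        return "looking_away"    # head turned, e.g. toward a phone
    if eye_openness < 2.0:
        return "drowsy"          # eyes nearly closed
    return "focused"

print(classify_state(True, 3.5, 5.0))    # focused
print(classify_state(False, 0.0, 0.0))   # away
```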


Our project’s physical outputs include three main elements: Aleria (the robotic arm), the base she sits on, and the stage on which she assembles the marbles. The first element, Aleria, is a robotic arm designed with 5 servos and 5 degrees of freedom; the mechanism comprises 3 main arm linkages, a rotating base, and a gripper that picks up and drops the marbles. The second element, the base, is slightly inclined and has guide walls that ensure the marbles roll to one precise position once dropped from the stage. The third and final element, the stage, includes voids for the 9 marbles to sit in and a hinge mechanism that allows Aleria to tilt it so the marbles can be dropped off.

To bring Aleria to life, we use three large, high-torque servos (MG996R) and two smaller ones: a low-torque SG90 and a slightly higher-torque MG90S. The larger servos ensure that the robotic arm can sustain its own weight, while the smaller servos sit in the third arm and are responsible for picking up the marbles using a one-to-one gear and gripper mechanism.


Controlling a robotic arm with 3 linkages and 5 degrees of freedom is an extremely challenging task: we need to determine the exact end point (exact coordinates) that we would like the robotic arm’s gripper to go to. Using forward kinematics, with the angle of each motor as an input, we can determine the gripper’s final coordinates with basic trigonometry formulas.
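The forward kinematics step can be sketched for a planar three-link arm like Aleria's: summing each joint angle and applying basic trigonometry gives the gripper's coordinates. The link lengths here are placeholder assumptions, not Aleria's real dimensions.

```python
import math

LINKS = [10.0, 8.0, 6.0]  # arm segment lengths in cm (assumed values)

def forward_kinematics(angles_deg):
    """Gripper (x, y) from joint angles; each joint is relative to the last."""
    x = y = 0.0
    total = 0.0
    for length, angle in zip(LINKS, angles_deg):
        total += math.radians(angle)      # accumulate joint rotations
        x += length * math.cos(total)     # project each link onto x
        y += length * math.sin(total)     # and onto y
    return x, y

print(forward_kinematics([0, 0, 0]))  # fully stretched along x: (24.0, 0.0)
```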

For programming forward and inverse kinematics, we used Python, as it is compatible with the visualisation tool matplotlib.pyplot. We first wrote the forward kinematics code, which would later serve as a verification tool for our inverse kinematics code, and then wrote the inverse kinematics solver itself.
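The verification loop described above can be sketched on a simplified two-link planar arm: solve inverse kinematics for a target point, then feed the resulting angles back through forward kinematics and check that they reach the target. Link lengths are assumed; the real arm has three links and more degrees of freedom.

```python
import math

L1, L2 = 10.0, 8.0  # link lengths in cm (assumed values)

def fk(t1, t2):
    """Forward kinematics: joint angles (rad) -> end-effector (x, y)."""
    x = L1 * math.cos(t1) + L2 * math.cos(t1 + t2)
    y = L1 * math.sin(t1) + L2 * math.sin(t1 + t2)
    return x, y

def ik(x, y):
    """Inverse kinematics via the law of cosines (one of the two elbow solutions)."""
    d2 = x * x + y * y
    cos_t2 = (d2 - L1 * L1 - L2 * L2) / (2 * L1 * L2)
    t2 = math.acos(cos_t2)
    t1 = math.atan2(y, x) - math.atan2(L2 * math.sin(t2),
                                       L1 + L2 * math.cos(t2))
    return t1, t2

# Verify the IK solution with FK, as described above.
x, y = fk(*ik(12.0, 6.0))
print(round(x, 6), round(y, 6))  # recovers the target (12.0, 6.0)
```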
