Sixteen tools were delivered at different levels of maturity: some are almost market-ready, while others are prototypes that require further R&D before being user-ready. They include the library of dance movements and its web-based interface (WhoLoDancE movement library and annotator), its extension into an educational platform in the service of dance teaching (WhoLoDancE educational platform), and several tools for data curation, such as the Movement quality annotation by comparison tool and the Segmentation tool, as well as a Low-end VR platform that visualises dance data as a 3D avatar on a common smartphone.

A set of software libraries for movement and music analysis, used in other applications, includes software for real-time extraction of movement qualities (Movement analysis software library), automatic annotation of dance data with movement qualities (Data-driven movement quality extraction software library) and analysis of the physical and expressive qualities of sound (Tools for music analysis).

The project outcomes also include user-ready tools for movement analysis, such as the Search and similarity engine and its derived applications, which analyse the representation of a movement and identify the most similar movements in the library. The query movement can be taken from the motion repository (Similarity search web-based application), recorded through low-cost sensors (Movement sketching tool) or captured with a simple smartphone or tablet camera (Real-time mobile movement search application). The same engine has also been used to develop a smartphone-based game (Real-time motion-based collaborative mobile game) that gives practitioners a score indicating how close their performance is to a given one, helping them improve their performance.
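The core idea behind such a similarity engine can be illustrated with a minimal sketch: each movement is reduced to a fixed-length feature vector (a descriptor), and a query is matched against the library by a similarity measure. The descriptors, movement names and the use of cosine similarity below are illustrative assumptions, not the project's actual representation.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def most_similar(query, library, top_k=3):
    """Rank library movements by descending similarity to the query descriptor."""
    ranked = sorted(library.items(),
                    key=lambda item: cosine_similarity(query, item[1]),
                    reverse=True)
    return [name for name, _ in ranked[:top_k]]

# Hypothetical three-dimensional movement descriptors; a real system
# would use much higher-dimensional features extracted from motion data.
library = {
    "pirouette":  [0.9, 0.1, 0.4],
    "plie":       [0.2, 0.8, 0.3],
    "grand_jete": [0.8, 0.2, 0.9],
}
query = [0.85, 0.15, 0.5]  # e.g. a movement sketched with low-cost sensors
print(most_similar(query, library, top_k=2))  # → ['pirouette', 'grand_jete']
```

The same ranking step also yields a closeness score, which is how a game built on the engine can grade how near a practitioner's performance is to a reference one.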
The Blending engine has been specifically designed for choreographic creation: the application allows users to assemble movements from the repository into a temporal sequence, or to superpose parts of different movements, generating new cross-genre movements and choreographies that can be visualised as an avatar before being practised in the studio.

The Choreomorphy tool enables dancers, through the use of a motion capture suit, to visualise themselves as different avatars, with a focus on traces and volumetric shapes, while the Sonification tool converts expressive movement qualities into sound, providing real-time, responsive feedback on the movement without causing distraction. Lastly, the HoloLens experience, conceived as a mixed-reality didactic application, enables a student to observe a piece performed by a dance master, imitate it and receive real-time image and sound feedback through the Microsoft HoloLens; a second HoloLens also allows for an additional teacher's view.

After the release of their user-ready versions, the final months of the project were dedicated to a thorough final round of evaluation of the tools by a team of selected experts from the dance and 3D animation fields, covering usability, effectiveness, accessibility and potential added value for their everyday work; this evaluation progressively guided the conclusive fine-tuning of the tools to their current versions.
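The Blending engine's two composition modes described above, sequencing movements in time and superposing parts of different movements, can be sketched as follows. The clip representation (per-frame lists of joint angles) and the plain weighted averaging are simplifying assumptions; a production system would more likely interpolate joint rotations with quaternion slerp.

```python
def sequence(*clips):
    """Assemble clips into one temporal sequence (simple concatenation)."""
    out = []
    for clip in clips:
        out.extend(clip)
    return out

def superpose(clip_a, clip_b, weight=0.5):
    """Superpose two equal-length clips frame by frame via a per-joint
    weighted average of joint angles (a crude stand-in for rotation blending)."""
    return [
        [weight * a + (1 - weight) * b for a, b in zip(frame_a, frame_b)]
        for frame_a, frame_b in zip(clip_a, clip_b)
    ]

# Hypothetical two-joint clips, two frames each (angles in degrees).
arm_wave = [[10.0, 20.0], [15.0, 25.0]]
leg_kick = [[30.0, 40.0], [35.0, 45.0]]

combined = sequence(arm_wave, leg_kick)        # 4 frames, one clip after the other
blended = superpose(arm_wave, leg_kick, 0.5)   # 2 frames mixing both movements
print(blended)  # → [[20.0, 30.0], [25.0, 35.0]]
```

The resulting frames could then be played back on an avatar for preview, which is the workflow the engine supports before a dancer rehearses the new material in the studio.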