While other modalities, such as vision, hearing, and touch, are supported by well-known interaction techniques, tools, and widgets, no comparable support exists for the sense of smell. Most work on smell in HCI is technology-driven and often results in one-off applications to enhance multimedia, gaming, or virtual reality experiences. There have been isolated attempts to use olfactory devices for health and well-being or for educational purposes, but the success and impact of these prior works is limited. The biggest stumbling block is the lack of a standardized framework that can be reused across different scenarios and applications, which in turn prevents the replication of olfactory experiences needed to advance the field of olfactory stimulation and perception. OWidgets addressed this challenge by creating an empirically based framework that enables the replication of olfactory experiences independently of the application and scent-delivery device. More specifically, we developed a software architecture alongside a fully controllable scent-delivery device to demonstrate our idea as a proof of concept. We exhibited OWidgets for the first time at the world's largest consumer electronics show (CES) in Las Vegas in January 2018. The direct engagement with relevant stakeholders and potential customers led to various follow-up activities around commercialization routes beyond the lifespan of this project.