The making of physical systems that integrate software and hardware to sense their surroundings and respond has been transformed by developments in artificial intelligence. These systems blend digital and physical processes into interactive experiences that feel more intuitive and predictive. This form of interaction between people and the built environment enables spaces to become more responsive and adaptable, qualities that would help the environment accommodate us and support more meaningful connections. This thesis proposes a physical system for designing environments that use machine learning to learn from their occupants and to explore possible interactions, providing appropriate physical responses by changing shape through flexible, movable parts. The question of what form of physicality provides a sufficient solution space is explored through physical computing, and different fabrication technologies are investigated to identify suitable materials and assembly methods. As a result, an interactive system is proposed that uses proximity sensors as input data; the data are processed through a set of machine learning modules that produce feedback data for the actuators. This system develops a human-device relation that learns and evolves over time, creating a unique user experience for each individual based on their interactions with the built environment.
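The sensor-to-actuator pipeline described above can be sketched as a minimal control loop. The sketch below is illustrative only: the class and function names, the choice of an incremental-mean learner, and the mapping from proximity readings to an actuator angle are assumptions, not the thesis's actual implementation.

```python
class OnlineMeanModel:
    """Incrementally learns the typical proximity reading for a space.

    Stands in for the machine learning modules described in the thesis;
    a real system would use a richer learned model.
    """
    def __init__(self):
        self.count = 0
        self.mean = 0.0

    def update(self, reading: float) -> None:
        # Welford-style running mean: learn from each new interaction.
        self.count += 1
        self.mean += (reading - self.mean) / self.count

    def predict_response(self, reading: float, max_angle: float = 90.0) -> float:
        """Map a closer-than-usual reading to an actuator angle (degrees).

        The closer a person is relative to the learned baseline, the
        larger the physical response of the movable part.
        """
        if self.count == 0 or self.mean <= 0.0:
            return 0.0
        closeness = max(0.0, (self.mean - reading) / self.mean)
        return min(max_angle, closeness * max_angle)


def control_step(model: OnlineMeanModel, reading: float) -> float:
    """One loop iteration: sense -> predict actuator command -> learn."""
    angle = model.predict_response(reading)
    model.update(reading)
    return angle


if __name__ == "__main__":
    model = OnlineMeanModel()
    for r in [100.0, 100.0, 100.0]:  # calibration: visitors at ~100 cm
        model.update(r)
    # A visitor approaching to 50 cm is half the learned distance,
    # so the panel opens halfway toward its 90-degree maximum.
    print(control_step(model, 50.0))  # → 45.0
```

Because the model keeps updating from every reading, the same hardware would gradually settle on different responses in different spaces, which is one way the "learns and evolves through time" behavior could be realized.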