// FMP & Thesis – #5


Round 1 of prototyping began with a trip down Walworth Road to visit a variety of charity shops. Spending roughly £10, I purchased several second-hand items and some cheap materials to work with for the day.



Taking apart some items, I used a sponge and a portable USB fan to prototype the heart device, creating a spongy, bulky wearable that would give simple vibration feedback. The sponge provided good inspiration regarding the sense of touch that would make this item playful – soft and squishy, it felt as if the user needed to massage the device to calm it down. Using the fan’s motor, I role-played with the device, imagining the motor turning on according to particular sensors such as heart rate, and turning off or slowing down as you massage the spongy parts of the device.




Further sketches developed the shape of the heart device, considering the use of materials and a fabric cover to make it seem friendlier and gentler. A circle in the middle would house the motor/servo, which could spin some form of graphic pattern (like a yo-yo being played) to indicate its function. This eventually developed into the idea that the heart could detect when the user is feeling guilty, acting as if it were your conscience.




Developing the sketch further and considering technical feasibility, the device’s shape was confirmed as an upside-down triangle with rounded edges – almost heart-shaped, yet not quite. I could realise its function with a simple heart rate sensor and a servo attached inside the spongy parts, and perhaps create some form of meter/measurement for the circular part. The device would be stuck onto the user’s shirt along with the attached sensor, and would claim to detect guilt when all it actually does is make arbitrary judgements based on your heart rate. Whilst it claims it will calm down if you massage the spongy parts, this could be false – it may simply calm down after a certain amount of time. This idea was developed whilst looking at smart homes and home automation systems such as Nest.
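To make the arbitrariness concrete, the behaviour I had in mind can be sketched as a small simulation. This is purely illustrative – the threshold, timings and names are all made up here, and the eventual prototype would run on an Arduino with a real heart rate sensor and servo:

```python
class Pulse:
    """Toy model of the 'Pulse' wearable: it claims to detect guilt,
    but really just compares heart rate to an arbitrary cut-off,
    and 'calms down' on a fixed timer regardless of squeezing."""

    GUILT_BPM = 90        # arbitrary threshold: above this, you are 'guilty'
    CALM_AFTER_S = 10.0   # the servo stops after this long, squeezed or not

    def __init__(self):
        self.agitated_since = None  # timestamp when the servo started spinning

    def read_heart_rate(self, bpm, now):
        """Feed in a sensor reading; returns whether the device is agitated."""
        if bpm > self.GUILT_BPM and self.agitated_since is None:
            self.agitated_since = now  # servo starts spinning: 'guilt detected'
        return self.is_agitated(now)

    def squeeze(self):
        pass  # the squeeze does nothing; calming is purely time-based

    def is_agitated(self, now):
        if self.agitated_since is None:
            return False
        if now - self.agitated_since >= self.CALM_AFTER_S:
            self.agitated_since = None  # 'calmed down' by the timer alone
        return self.agitated_since is not None
```

The point of the sketch is that `squeeze()` does nothing at all: the device’s claim of responding to massage is theatre, and the “guilt” verdict is just a number compared to a cut-off.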

The Nest Learning Thermostat by Nest Labs is an electronic, programmable, self-learning, Wi-Fi-enabled thermostat that optimises the heating and cooling of homes and businesses to conserve energy.[1] It is based on a machine learning algorithm: for the first few weeks, users have to regulate the thermostat themselves in order to provide a reference data set. Nest can then learn people’s schedules – which temperatures they are used to and when. Using built-in sensors and phones’ locations, it can shift into energy-saving mode when it realises nobody is at home.[2]

Home automation, or the smart home[1] (also known as domotics), is the residential extension of building automation and involves the control and automation of lighting, heating (such as smart thermostats), ventilation, air conditioning (HVAC) and security, as well as home appliances such as washer/dryers, ovens or refrigerators/freezers that use Wi-Fi for remote monitoring. Modern systems generally consist of switches and sensors connected to a central hub, sometimes called a “gateway”, from which the system is controlled through a user interface – a wall-mounted terminal, mobile phone software, tablet computer or web interface – often but not always via internet cloud services.

Just what exactly was “smart”? Can it really be clever enough to quantify and detect emotions and feelings such as guilt? What sort of arbitrary parameters does it consider when it says it adapts to a user? Can a system of sensors, numbers and a central hub really be objective and neutral enough to say when a person is feeling guilty? Following this line of inquiry, I continued to design the heart device, giving it the following description.

“Pulse” is a smart wearable that reacts to guilt and calms down upon being gently squeezed.



For the hand object, I found inspiration in taking apart a small handheld toy that would push out a doll’s head when pressed, accompanied by a comedic bouncing sound effect. The motion of holding down the button for the spring to extend helped me think about how I wanted this device to feel – like an extension of your hand and arm’s basic functions. I also tried to take apart a robot dog toy for parts, but failed due to rusted screws.






Developing the idea of feeling data with the hand, the pointing motion inspired a new function – what if it could perform cloud storage simply by pointing towards the clouds in the sky? This eventually took the form of a giant foam hand with an extending mechanism. The user can plug in their storage devices, switch between upload and download modes, and then hold down the button for the hand to spring out, pointing towards the clouds for the upload/download process to happen. Through this idea, I wanted the device to look at the subject of cloud storage and explore the nature of what “cloud” actually means.
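The interaction loop described above can be sketched as a tiny state machine. This is a hypothetical illustration of my own – the mode names and reported strings are invented, and of course nothing here touches a real cloud service; the fiction of the object is the point:

```python
class Point:
    """Toy model of the 'Point' hand: toggle upload/download,
    then hold the button to spring out and 'transfer' via the sky."""

    def __init__(self):
        self.mode = "upload"   # or "download"
        self.extended = False  # whether the foam hand has sprung out

    def toggle_mode(self):
        """Flip between the upload and download modes."""
        self.mode = "download" if self.mode == "upload" else "upload"

    def hold_button(self):
        """Spring mechanism fires; the hand points at the clouds."""
        self.extended = True
        return f"{self.mode} in progress"

    def release_button(self):
        """Hand retracts; the 'transfer' is considered complete."""
        self.extended = False
```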

Cloud computing and storage solutions provide users and enterprises with various capabilities to store and process their data in third-party data centres[3] that may be located far from the user – ranging in distance from across a city to across the world. Cloud computing relies on the sharing of resources to achieve coherence and economies of scale, similar to a utility (like the electricity grid) over a network.

Is it really floating in the sky? Is it really white, weightless and ephemeral? How does the usage and imagery of the word “cloud” obscure the fact that one is storing or downloading files from a third-party location, perhaps far, far away from one’s current location? Does it hide the fact that your data is being stored somewhere else, on a computer under someone else’s control? Does it obscure the privacy concerns regarding your files within the cloud? By directing the focus back towards the physical cloud, I was hoping to bring some of these questions out into the open.

“Point” is a file-sharing device that can upload and download files by pointing at clouds in the sky.




Whilst the prototyping session helped me think about two of the objects, the last one was still meandering between concept, form and message. I had ideas regarding headsets, visors, surveillance-based issues and VR, but nothing in particular came together into one solid idea. Some interesting ideas that popped up during this time were shared VR experiences, or fantasising about different types of realities that could be explored (SR – Super Reality, UR – Ultra Reality, etc.) and playing around with that.




Eventually, doing some research back into the nature of VR, I decided to look into stereoscopic vision, the 3D effect and the immersive aspects of VR. Just what exactly was the “reality” part of it? Is it truly another world? Or is it really just a visual trick that immerses you? Some ideas emerged from considering how VR would be affected if you only had one eye – how would that change your immersion? Doing some quick sketches, I thought about creating modular parts for a VR headset specific to the left and right eyes, allowing the user to customise it according to their own liking. I then decided to prototype by slicing off half of a cardboard headset.





Using a free promotional headset I had taken from an exhibition several months earlier, I sliced it in half and stuck an Arduino with a touchscreen onto it, exploring how it could potentially work. Afterwards I decided against this idea, as the form of the half-headset didn’t look appealing, nor could it be easily understood. However, the experience of one eye seeing the real world and the other seeing a virtual world developed further in the sketches, and I decided to make a cardboard headset based around this idea.

“Peek” is a headset that traverses the virtual world and real world through the left and right eyes.