The color of the day

One of the themes I am exploring with my clock is eliminating clock hands and communicating the time through a gradient of colors. My first experiment does away with the hands entirely and uses the gradient to show how much daylight remains in the day.

For that, I set the following goals for my sketch:

1) Determine dawn, dusk, sunrise, sunset timings.

2) Make a gradient move across the screen based on the amount of sun-time remaining.

3) Make it flexible so that swapping colors will be easy to experiment with.

4) Base all calculations on the actual screen dimensions.

5) Do all the color transforms in HSLuv.

The clock can be found running at:

An animated version of the sketch can be found here:

Finals: Let there be light!

The proposed lamp will change color temperature across the day in a pattern which follows the natural path of the sun. It will be warmer in the mornings and evenings and cool white during the day. The brightness of the surroundings will change the brightness of the lamp.


Currently, the circuit to light up the lamp and change its color temperature is working as seen in the video below:

Wandering on the 5th floor

The 5th floor is a pretty oppressive floor with overbearing walls and claustrophobic passages. Trying to find an interesting lighting moment was pretty bleak, but I found that some of the open spaces had interesting spotlights. Now, spotlights are not something that would strike me as an addition inside a house, but I liked how they create a sense of drama and highlight in an otherwise boring space. The light shimmering off the curtain was also an interesting effect to observe. The interplay of light and cloth was pretty awesome, and while we do not usually think of matching textures to color, it adds an interesting effect to play around with. With small bright LEDs appearing on the horizon and getting affordable, are they going to be used more in home lighting?

Average light: 200 lux

The walls of the corridor need more diffused light and a warmer color.


Critical objects: Final proposal

Team: Arnab Chakravarty, Gilad Dor.


For our final, we are working on the topic of more-than-human-centered design. As we collectively face environmental degradation in the context of the Anthropocene, the question we are asking ourselves is whether we can build systems that include non-human voices and agendas. Can we design everyday technologies to amplify our connection with the non-human instead of hiding it away? Is there any space for designing objects of technology with principles of empathy and kindness instead of efficiency and usability? Here, we would like to anchor our project in the principles of para-functionality as defined by Anthony Dunne in the book “Hertzian Tales“ and the idea of eccentric engineering by Tega Brain.

Our final project is tentatively titled ‘Plant radio‘, where the functionality of a radio and a plant are intertwined in its form and function. The plant is embedded with moisture, light and humidity sensors, and the users have a big dial with which they can change the radio station. The radio plays as expected when the plant is being kept well. However, if the plant is not being taken care of, it will randomly change the radio station to another station that is close to its mood (sadness, hurt, melancholy). The user and the plant engage in a subtle tug of war which has no fix apart from taking care of the plant. The aim is to make viewers and users think about the nature of our computationally mediated interfaces.


  • Soil/moisture/light sensors -> Arduino -> VS1053 mp3 board -> FM transmitter
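The decision logic at the heart of that pipeline can be sketched like this. This is illustrative JavaScript (the actual build runs on an Arduino), and the thresholds and station frequencies are made-up placeholders:

```javascript
// Sketch of the plant radio's decision logic. A well-kept plant leaves the
// dial alone; a neglected one hijacks it toward a mood station.
// All stations and thresholds here are hypothetical.

const moodStations = {
  melancholy: 88.1,
  hurt: 90.5,
  sad: 101.3
};

function plantMood(moisture, light) {
  // Readings normalized to 0..1; cutoffs are assumptions
  if (moisture > 0.4 && light > 0.3) return 'content';
  if (moisture > 0.4) return 'melancholy';
  if (light > 0.3) return 'hurt';
  return 'sad';
}

function stationFor(userStation, moisture, light) {
  const mood = plantMood(moisture, light);
  return mood === 'content' ? userStation : moodStations[mood];
}
```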


  • Attribute: Household object

  • Device: Transmission

  • Mood: Eccentric


  • Tabletop object with 1 to 1 interaction between the user and the device.

  • No acrylic

This land is your land

For the body politics assignment for Critical Objects, Sukanya and I teamed up to work together. Our initial impulse was to work with the notion of body, movement and permission in geographical spaces. We came up with a few concepts which revolved around:

  • Lack of free movement across geographical areas for people based on the country they come from.

  • Race, income levels, gender, sexual orientation and other parameters as barriers for access in these spaces.

We initially brainstormed ideas around a wearable device that allows you movement only in areas that you can go to, but after presenting these ideas to other people and hearing their critiques, we decided to focus on the idea of movement based on where you are born. While the issue of movement is intersectional and spans both the global and the local, the passport is the universal method of control over individual movement that we have all collectively agreed upon. It’s also interesting to observe that while passports were introduced in their current form during World War 1 for security reasons, they were quickly assimilated as standard procedure. The passport is such a pervasive document that most of us take it for granted and have never questioned its existence in our daily lives.

Also, one of the first things that struck us when we moved to a western country was the absolute lack of awareness among people with American/EU passports of the sheer difficulty of getting visas and being able to move if you do not have the passport of the right country. Another thing we observed was the conflation of travel with self-care, personal development and self-actualization, and how it presents a very privileged narrative as a universal truth.


To investigate this further, Sukanya wrote a script that compiled the set of countries each nation has access to without a visa/prior permission and then ranked them. Instead of directly counting the number of countries, the script calculated the land mass of those countries and created a ranking from that. Comparing landmass was more interesting, as the number of countries could be misleading, and hey, aren’t boundaries artificial constructs anyway? For the final form, we settled on the following as our framework:
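The ranking idea can be sketched like this. The data below is a tiny made-up sample for illustration, not the dataset the script actually compiled:

```javascript
// For each passport, sum the land area of the countries it can enter
// visa-free, then rank passports by total accessible landmass.
// Country names and areas are hypothetical.

const landArea = { A: 9.8, B: 3.2, C: 0.7, D: 17.1 }; // million km^2

const visaFree = {
  A: ['B', 'C', 'D'],
  B: ['C'],
  C: ['A', 'B', 'D']
};

function rankByLandmass(visaFree, landArea) {
  return Object.entries(visaFree)
    .map(([passport, countries]) => ({
      passport,
      area: countries.reduce((sum, c) => sum + landArea[c], 0)
    }))
    .sort((a, b) => b.area - a.area); // largest accessible landmass first
}
```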

Attributes: Wearable

Device: deadpan, dry

Mood: Mundane everyday

We centered on using an eyepiece as the object of focus. The metaphor of being limited in seeing the world resonated very strongly with us, and we decided to use that in our form. We did a few tests to see the kind of blocking we wanted from our object.

Blocking the lens in a circular pattern made the image sharper. Totally opposite to what we were going for.


Blocking the sunglass from the bottom was better than the top as the head instinctively moved down to focus the eye on the transparent part.


For the form of the final object, we went back and forth on it. Our initial instinct was to make something sculptural inspired by the images below:


We tried sketching variations of this theme, but our framework helped reorient us. We decided to make the frame a mundane, everyday pair of sunglasses, mirroring the acceptance of the passport in our day-to-day lives. To make the frame, we tried 2 approaches:

Approach 1) 3D print the frame.

Approach 2) Use frames that have been bought previously.

We did not fix on an approach at first, as we wanted to cast our lenses in resin. We spent a lot of time experimenting to get the mold to set without leaking.

However, the end result was very unsatisfactory, so we decided to cut the lenses out of acrylic and set them in a 3D-printed body. Sukanya wrote another script that calculated the precise area of the lens so that we could divide it into accurate percentages based on each country’s access to landmass. For the first batch, we made a set of 3 countries that covered the spectrum of the ranking. We faced some issues with the 3D printer, but nothing that sandpaper and files could not solve. Some images of our process below:

Final frame 3D print


Testing the fit of the lens


Trying out variations for perfect fit


Gluing and sticking the frames


After multiple trials and errors, we finally nailed down the fit and size for the final set of sunglasses below. Can you guess the countries?
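The lens math can be sketched as follows. This is an assumed reconstruction for illustration, not Sukanya's actual script: it treats the lens as a circle blocked from the bottom and solves for the blocking height that leaves a given clear-area fraction:

```javascript
// For a circular lens of radius r blocked from the bottom edge up to
// height h, find h so that the clear (unblocked) area matches a country's
// landmass-access fraction.

function segmentArea(r, h) {
  // Area of the circular segment of height h measured from the bottom edge
  return r * r * Math.acos((r - h) / r) - (r - h) * Math.sqrt(2 * r * h - h * h);
}

function blockHeightFor(r, clearFraction) {
  const target = (1 - clearFraction) * Math.PI * r * r; // blocked area wanted
  let lo = 0, hi = 2 * r;
  for (let i = 0; i < 60; i++) { // bisection: segmentArea grows with h
    const mid = (lo + hi) / 2;
    if (segmentArea(r, mid) < target) lo = mid; else hi = mid;
  }
  return (lo + hi) / 2;
}
```

For a 50% clear lens this puts the chord exactly at the center of the circle, which matches intuition.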


Animistic design and bringing toys to life

For a class assignment, we were asked to choose a research paper from the Interaction Design and Children archive and summarize it. I was specifically looking for papers that talk about the design of the ‘form’ of smart and networked toys, and I was lucky to come across a paper titled “When toys come to life: considering the internet of toys from an animistic design perspective“ (Link).

I had heard the term ‘animistic design‘ used to describe interactions between humans and non-humans. I believe that human-centered design methodologies are limited in how they treat non-human actors (specifically AI) and that we need to think of ecologies of use where every non-human actor is considered on par with humans. This article goes into more depth.

Coming back to the paper: it revolves around the design qualities of connected toys in a complex setting of people, objects and data that react to each other. The researchers designed three toy concepts, which were then analysed from the perspective of animistic design, questioning whether and how their design qualities would foster or inhibit 1) agency, 2) embodiment, 3) a certain ecology of objects and subjects, and 4) uncertainty.


Concept A) A robot that children could interact with as a playmate by programming it to move in different directions. LEDs on the robot gave feedback, and the children could also personalize the robot using Lego bricks.

Concept B) A figurine that could interact with many objects (such as cards), create a virtual avatar and interact with it on an iPad. The figurine acted as an intermediary for the child to interact with the digital world.

Concept C) A smart bracelet that could connect to smart devices. The child could customize and attach different pins, and each combination of pins had a unique effect in a game. The bracelet also stored the child’s data and preferences so they could continue play at a different location.

Looking at the three designs, the authors conclude that the bracelet allowed for divergent interactions, created less dependence on instructions, gave rise to more autonomy and had more fluid boundaries of use, which encouraged children to display more agency. The anthropomorphic qualities of the other toys created expectations of consistent use, actualizing certainty and predictability. The authors also note that the approach requires more grounding in user research for these findings to be validated, but there is a rich space for exploring divergent forms that do not stick to the usual approach of anthropomorphizing connected devices. This was an eye-opener for me and will make me reconsider the product form that we are working on for our finals.

What does the world think of Cozmo?

With Cozmo, it was love at first sight. So, when the assignment to review a toy was given to us, I could not think of going with anything but Cozmo <3.

Cozmo is a very popular toy that had massive publicity when it launched in 2016. It was covered by most tech publications (Link 1, Link 2, Link 3) and people (me included!) have been going ga-ga about it. But does the toy hold up 2 years later? Let’s find out!

First stop, the Cozmo website!

Cozmo is marketed as a toy with brains and personality. The home page (Link) talks about Cozmo as an accomplice that fits into your home, and the first page has multiple videos of what Cozmo can do. It’s interesting to observe that the videos focus on Cozmo interacting with things and doing things instead of its design and looks. Clearly, the company is confident in Cozmo’s personality as the driver for sales.


The second tab on the homepage is ‘Life with Cozmo’, which is an interesting choice. The tech tab is fourth in the navigation menu, which usually is not the case. Most companies love to show off the tech first, but the makers of Cozmo stick with what it can do. *Applause*

One of the most interesting features, animal detection, is hidden inside the tech section. They could have brought it up front to convince families that Cozmo would fit into a family perfectly.


So: brains, personality, smarts and fun engagement for the whole family (animals included!). Does Cozmo hold up to its promises? Let’s find out.

My first stop is Amazon, where it has a great rating of 4.4! Link


Reading through the positive and negative comments, the observations are:

  • A lot of the negative issues are because of technical glitches, unresponsive support and quality control.

  • A few parents had an issue with how it was tethered to a mobile phone; they did not want their kids glued to a smartphone. This issue is going to be germane to a lot of smart devices in the future. How can we build things that do not need a mobile phone as a driver?

  • While most people love the personality of the toy, it gets repetitive after a while. How developers can keep building a device’s personality over extended periods of time is a challenge that will need to be addressed by the creators.

  • Cozmo has a high plateau of engagement. Some kids drop off very early and don’t see the point in the hassle of setting it up and playing with it, whereas the ones who praise it seem to have stuck with it for a while.

So, clearly the personality is a hit, but the intelligence seems to be quite basic. I started to look at people who have been hacking Cozmo and using it as a platform. And guess what! There is an ocean of such content!

YouTube is full of Cozmo hack videos. Link

Someone made their own Link

The sub-reddit is extremely active with videos, support help, hacks, mods and what not! Link

In my observation, Cozmo has managed to create a small following of people who are invested in the platform, but my feeling is that it has the same issues as the Kinect: people don’t see the value upfront because the out-of-the-box execution doesn’t hold up to the promises made, but enthusiasts love it for the flexibility and extensibility it offers. I wonder what direction Cozmo is going to take from here.

The biggest take-aways for me after delving deep into the world of smart toys are:

  • With buzzwords like intelligent, smart and AI thrown around, are we setting high expectations from a toy which then fails to live up to the hype? How can we set realistic expectations?

  • Are we diluting what we mean by smartness? What is so smart about Cozmo when most of its behavior feels programmed rather than emergent?

  • How can the behavior of an AI toy feel more organic across time?

  • How can we design the out-of-the-box experience in a way that connects with an impatient kid so that they do not give up on it within a few hours?

  • The creation of a personality is paramount for a smart device. I wonder how the designers, engineers and product developers worked to create Cozmo’s unique personality. It’s a great case study, and having worked on large teams, I can understand how hard it is to pull something like this off. The tight integration between multidisciplinary teams is something I would like to understand more.

And while I chew on these questions, here are a few lovely videos of Cozmo with animals.


For this week’s assignment, we had to make a web dashboard for controlling a Philips Hue bulb. I was pretty stretched for time this week, so I decided to keep things simple and learn the basics. Going through the tutorial was pretty self-explanatory and the code on GitHub was pretty logical. However, I tripped up on the callbacks, which were quite confusing for me to understand. Thankfully, I read through Timothy and Atharva’s blogs and their callbacks made sense to me. I created a basic UI where the Hue can be controlled by changing its hue, saturation and brightness values through sliders, and I also gave users the option to turn it on and off. I also changed the background of the webpage to match the color of the Hue bulb.


I looked at changing the color of the text to a complementary color based on the currently selected value, but did not find an easy way to convert HSB values to their complementary colors. I had also thought of an ambient mode where the digits of HH:MM:SS are converted into an RGB hex value which is then transmitted to the Hue. Again, I tripped up because I could not find a reliable way to do this: passing RGB to HSL did not match the colors. So my top questions for this week are:

1) How do you calculate the complementary HSB value of a color through code?

2) How do you convert a rgb color to a HSB color?
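One standard way to do both, written from scratch here rather than taken from any particular library: a complementary hue is the hue rotated halfway around the wheel (in the Hue API's 0–65535 range you would add 32768 and wrap instead of 180 degrees), and RGB to HSB is a short max/min computation:

```javascript
// Complementary color in HSB: rotate the hue halfway around the wheel,
// keeping saturation and brightness the same
function complementary(h, s, b) {
  return [(h + 180) % 360, s, b];
}

// RGB (0..255 per channel) to HSB (h in degrees, s and b in 0..1)
function rgbToHsb(r, g, b) {
  r /= 255; g /= 255; b /= 255;
  const max = Math.max(r, g, b), min = Math.min(r, g, b);
  const delta = max - min;
  let h = 0;
  if (delta !== 0) {
    if (max === r) h = 60 * (((g - b) / delta) % 6);
    else if (max === g) h = 60 * ((b - r) / delta + 2);
    else h = 60 * ((r - g) / delta + 4);
  }
  if (h < 0) h += 360;
  const s = max === 0 ? 0 : delta / max; // saturation
  return [h, s, max];                    // brightness is just max
}
```

For the ambient-mode idea, running the clock digits through `rgbToHsb` first and then sending the HSB values to the bulb should keep the colors consistent.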

Currently listening: A whiter shade of pale - Procol Harum

References used:


//IP address of the Philips Hue hub
let IPHub = '';
let userName = 'Your user name goes here'; // My user name as per the hue developer API
let url;

let gotham;

//Variables for display of controls and labels
let canvas;
let checkBox;
let button;
let hueSlide;
let satSlide;
let brightSlide;

//Variables for controlling the philips bulb and its color
let lightNum = 1;
let lightState = false;
let hueVal = 32767;
let satVal = 127;
let brightVal = 127;

//Loading fonts
function preload() {
  gotham = loadFont('assets/Gotham Book.otf');
}

function setup() {

  //Create canvas
  canvas = createCanvas(windowWidth, windowHeight);
  colorMode(HSB, 65535, 254, 255);
  background(hueVal, satVal, brightVal);
  textFont(gotham);

  //Declare Hue URL
  url = 'http://' + IPHub + '/api/' + userName;

  //Position ON/OFF checkbox
  checkBox = createCheckbox(' IS ON/OFF', false);
  checkBox.position(width / 2 - 90, 200);
  checkBox.changed(toggleLight);

  //Position sliders for color control (rotated to run vertically)
  hueSlide = createSlider(0, 65535, 32767, 100);
  hueSlide.position(width / 2 - 225, 400);'transform', 'rotate(90deg)');

  satSlide = createSlider(0, 254, 127, 1);
  satSlide.position(width / 2 - 100, 400);'transform', 'rotate(90deg)');

  brightSlide = createSlider(1, 255, 127, 1);
  brightSlide.position(width / 2 + 25, 400);'transform', 'rotate(90deg)');

  //Position button for 'Ambient Mode'
  button = createButton('Ambient mode');
  button.position(width / 2 - 85, 600);
  button.mousePressed(changeBG);
}

function draw() {

  if (hueSlide.value() != hueVal || satSlide.value() != satVal || brightSlide.value() != brightVal) {

    //Capture slider values and push the new color to the bulb
    hueVal = hueSlide.value();
    satVal = satSlide.value();
    brightVal = brightSlide.value();
    changeLightColour();

    //Change background to mirror the bulb
    colorMode(HSB, 65535, 254, 255);
    background(65535 - hueVal, 254 - satVal, 255 - brightVal);

    //Display text
    textSize(width / 15);
    textAlign(CENTER, CENTER);
    text('Huehuehue', width / 2, 100);

    textSize(width / 60);
    text('Hue', width / 2 - 160, 500);
    text('Sat', width / 2 - 35, 500);
    text('Brightness', width / 2 + 100, 500);
  }
}

function toggleLight() {
  let path = url + '/lights';
  httpDo(path, 'GET', toggleGetResponse);
}

function toggleGetResponse(getData) {
  let lights = JSON.parse(getData);
  lightState = lights["1"].state.on;

  let body = {
    'on': !lightState
  };
  let path = url + '/lights/' + lightNum + '/state/';
  httpDo(path, 'PUT', body, togglePutData);
}

function togglePutData(putData) {
  let response = JSON.stringify(putData);
  if (response.includes("success")) {
    lightState = !lightState;
  }
}

function changeLightColour() {
  let body = {
    'bri': 255 - brightSlide.value(),
    'sat': 254 - satSlide.value(),
    'hue': 65535 - hueSlide.value()
  };
  let path = url + '/lights/' + lightNum + '/state/';
  httpDo(path, 'PUT', body, changeColourResponse);
}

function changeColourResponse() {
  console.log('Colors changed!');
}

//Ambient mode: meant to map the digits of HH:MM:SS to a color (still unfinished, see above)
function changeBG() {
  let hr = hour();
  let mn = minute();
  let sc = second();
}

Disobedient electronics

The theme for our second assignment was to create an object that exemplifies the ethos of disobedient electronics. I teamed up with Winnie Yoe, and in our first discussion we set a few learning objectives for ourselves: 1) Learn how to use the ESP32. 2) Learn how to fetch and display real-time data. 3) Use data to work with a mundane, regular object that we see day to day.

Initially, we looked at the NYC open data sets and found some interesting data around maternal health, mental health and the drug crisis. We were interested in using the data on the opioid crisis, but we realised that none of the datasets we had were real-time, and none had granularity beyond a district zone. Working with such large datasets was proving to be challenging, so we gave up on the approach.

During the discussion, we started talking about how mundane objects are basically fronts for corporations inside our homes in the name of ‘smartness‘. That struck a chord, and we refined the idea into a simple ‘smart‘ bulb that is free to use but won’t light up if the company’s latest stock price is lower than the previous day’s. Going through stock price APIs, we found one that was easy to use but only gave daily prices. We wanted hourly data, but in the interest of time we went ahead with the one we found to build the proof of concept. We used the ESP32 HUZZAH to control the light bulb.
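The bulb's core rule is tiny. Here it is sketched in JavaScript for illustration (the actual build runs on the ESP32, and the data shape is an assumption, not the specific API we used):

```javascript
// Decide whether the bulb should start blinking: compare the latest two
// daily closing prices. dailyCloses is newest-first, e.g. [165.2, 167.8].
function shouldBlink(dailyCloses) {
  return dailyCloses.length >= 2 && dailyCloses[0] < dailyCloses[1];
}
```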

The final interaction was as follows: the bulb lights up if it detects the presence of the user and then checks the stock price of the company (*cough* Facebook *cough*). If the price is lower, it starts blinking annoyingly. The user then has to mash the ‘like‘ button, which leaves gratuitous comments on social media (not prototyped), and the bulb is ready for use again. You can watch the interaction in the video below.

I was quite happy about getting the APIs to work with the chip. I realise that there are conceptual gaps in our prototype but a lot of it was pared down in the interest of time. I believe that there is enough depth in the concept to take it further and I would like to see if I can do the same project in a more refined manner later.

A piece of velvet

I grew up in a small city in India. It was hot, dry and utterly boring in a way that only small cities in 80’s India can be. I was born premature, which made me pretty sick through my early years, and having no brothers and sisters, I was pretty much in my own head. A bursting imagination often needs outlets, and for me, it came in the form of playing with wooden toys. My family did not have a lot of money, so LEGOs, action figures and toys were out. But as a child, who cared? A few blocks of wood, plastic and boxes and you had a castle going! And in 80’s India, no one around me had any expensive, manufactured toys, so it wasn’t as if I felt the need for something that wasn’t being given to me. I was pretty happy in my own head until I saw an advertisement for a GI Joe.


GI Joes were probably the first things I ever wanted. I was entranced, and I remember throwing tantrums to have them. My parents couldn’t really afford them so they would try to keep me away, but being a male child in India comes with doting grandparents and uncles who would cater to my whims. My frustrated parents couldn’t really say anything, and in a few years I had a collection of about 50 of them.


But this is not the story of the GI Joes.


One summer, while spending the vacation at my maternal grandmother’s place, I came upon a box which held a small piece of velvet, 2 tiny pillows and a small piece of wood. The velvet was bedraggled, with aluminum milk-bottle caps stuck on it; the pillows were made out of cloth; and the piece of wood was, well, a piece of wood. When I asked my grandmother about it, she told me that it was a bed for my mother’s doll. My family is one that was torn apart by the partition of India, and both my grandfathers had to leave everything they knew behind to start from scratch. So, we never had a lot of money, and buying a doll was impossible. But that didn’t stop my grandmother and mother. They made dolls from whatever material they could find, and built a bed, a blanket and pillows. My mom grew up playing with a stuffed piece of cloth and treasured it long after she had outgrown it. On that summer afternoon, it all came rushing to me: how entitled I was to ask for an expensive piece of plastic that was way above our means, while my parents still tried to do the best they could. I felt the insides of my stomach churn. I had no way to understand what I was feeling as a child, but that feeling created a sense of gratitude for them trying to do the best for me in whatever way they could. The little piece of velvet became a part of my GI Joe collection. After a long, hard day of fighting, they were all put to sleep under my mom’s velvet blanket. After all, warriors need to sleep. I often wondered what they dreamt of. What would people who fought all day dream of? Would they dream of peaceful times or more war? And under the glittering, shiny blanket, would they be happy? I did not know, but it was fun to imagine.


GI Joes unleashed my imagination. Simulation video games and construction kits later shaped my intellect and thinking and unleashed my ability to make. But my mother’s piece of velvet taught me gratitude, kindness and softness. And for that, I am grateful. Growing up as a man in India, you have a lot of hard edges, as a patriarchal, masculine society shapes you to be. But a piece of cloth can round you out like stones in a river. Who could have guessed?


The 1st week at ITP is bizarre. The floor turns into a bazaar, with students hopping in and out of classes and checking Albert more than Instagram. Caught in the vortex of this hurricane that sweeps through the floor, I somehow ended up in Light & Interactivity (people who dropped the class, I owe you one!). So without much ado, here’s the first assignment.

My task: To fade an LED without using linear PWM. (It’s not the first semester anymore!)

Now, the task seemed deceptively simple. All you had to do was figure out a curve pattern, work out the equation of the curve and voila! An expressive LED. That was until I hit an issue that is, apparently, an open secret. To explain further, here is the first video:

As you watch the LED fade, trace an imaginary graph of the increase in the light with your fingers. You will come to a realisation which is this:


The curve on the left is what was used to program the LED (linear PWM) but your eyes see what is essentially an exponential growth. This article does a great job explaining the issue and some good discussion can be found here.

So, it was clear that the curve needed to be compensated for in the opposite direction to create a more linear fade. I came across this article which suggested an equation for achieving the same and it felt much better.
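The general shape of such a compensation can be sketched like this. This is a generic exponential mapping written for illustration, not the exact equation from the article:

```javascript
// Map a linear input t in 0..1 to a PWM duty cycle so that perceived
// brightness, not emitted power, rises roughly linearly. Each equal step
// in t multiplies the output by a constant, which approximates the eye's
// logarithmic response.
function perceptualPwm(t, maxDuty = 255) {
  const R = Math.log(maxDuty + 1); // scale factor so t = 1 hits maxDuty
  return Math.round(Math.exp(t * R)) - 1;
}
```

On an Arduino the same formula is usually precomputed into a lookup table so the fade loop stays cheap.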

This seemed like a good point to try out more curves. First comes the normal sine fade from Tom’s example.

Watching this go on and off, I thought it would be cool to replicate the ‘breathing‘ light on the Mac laptops of old. It turns out that the pattern is patented (duh!), and Lady Ada tried to reverse engineer it but did not publish the curve equation; more on that here. If you look at the wave function on the oscilloscope, it looks like a sinusoid with the top clipped off at the peak. I assumed that I would have to do the math for it, but lo and behold! The internet giveth in abundance! Someone had written a great blog post on the topic and done the math. Woohoo! It’s a great post which fully explains how to derive an equation from a curve using Wolfram Alpha; read it here. Off I went and wrote an Arduino sketch, with the results below:
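As a guess at that clipped-sinusoid shape (my own reading of the oscilloscope trace, not the derived equation from the blog post), you can scale a sine wave past full brightness and clamp it, which flattens the peak; the gain value here is an assumption:

```javascript
// Breathing brightness 0..1 at time t (ms): a sine lifted into 0..1,
// scaled by a gain > 1, then clipped at 1 so the peak is flattened.
function breathe(t, period = 5000, gain = 1.25) {
  const s = Math.sin((2 * Math.PI * t) / period) * 0.5 + 0.5; // 0..1 sine
  return Math.min(1, s * gain); // clip the top of the peak
}
```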

I am not sure if you can see the difference, but a small, subtle change in the graph can create perceptible differences. After having scratched the itch of replicating the MacBook light, I started looking at other repos on GitHub and came across this repo, which approximates the sine transition with a quadratic equation. The author has a great post explaining his approach to balancing performance and ease of use while developing the library here.

The result looks like this:

While doing these experiments, I started thinking of the motion curves that are used for defining animations and wondered if they would be of any use. It turns out there is an old library which has converted all of Robert Penner’s iconic easing-curve work for Arduino. It was written for controlling servos, but with a few tweaks I could get it to work with LEDs:

I did not get much time with the library, but on first impression it’s extremely easy to use for any Arduino-controlled motion. The light fades, however, are not as pretty as the motion curves, either because of perceptual differences or because the library needs modifications. I shall dig into this more later and report back.
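For reference, here is what one of Penner's curves looks like written out directly (the standard ease-in-out quadratic, independent of the Arduino port):

```javascript
// Ease-in-out quadratic: accelerate for the first half of the fade,
// decelerate for the second half. Input and output are both 0..1.
function easeInOutQuad(t) {
  return t < 0.5 ? 2 * t * t : 1 - Math.pow(-2 * t + 2, 2) / 2;
}
```

Feeding the output through a perceptual correction before writing it to the LED is probably what the fades need to look as good as the motion does.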

Currently listening: Lucy in the sky with diamonds- The Beatles

Week 1: Just another basic server.

For this week’s assignment, we were asked to create a simple HTTP server using node.js and express. Both of these were completely new to me, and I decided to keep my ambitions in check and build something that works instead of repeating the glorious failures of ICM and P.Comp in the semester past.

For starters, I familiarized myself with node and express with Dan Shiffman’s videos. (Link)

The ‘Programming A to Z’ website also has some great explanations on working with node.js and express (Link)

For my assignment, inspired by a combination of hanging out with small bots in ‘Hacking smart toys for AI learning‘ and listening to Leonard Nimoy narrating Ray Bradbury’s ‘There will come soft rains‘, I decided to make a web server that controls a bot in the following ways:

  • Make the robot move ahead. (/forward)

  • Make the robot move behind. (/back)

  • Make the robot turn left or right. (/turn/[:left or :right])

  • Make the robot dance. (/happydance)

The code was pretty uneventful except for constantly having to turn the server on and off. Another part that tripped me up was that, at home, I encountered an error when trying “my network IP“:8080 instead of localhost:8080. It works like a charm inside ITP though. Maybe it was happening because I was on a hotspot, but I had no idea how to rectify it. I would like to know more about how to identify and fix such network issues.

/* References used:
   4-line server example from Tom Igoe's class
   Dan Shiffman's videos from Coding Train */

//Include express
let express = require('express');

//Create a server
let server = express();

//Serve static files from public
server.use('/', express.static('pages'));

//GET parameters
server.get('/turn/:direction', turnBabyTurn);
server.get('/forward', moveForward);
server.get('/back', moveBack);
server.get('/happydance', happyDance);

//Start the server
server.listen(8080);

//Functions to send response to GET requests

//Robot turn
function turnBabyTurn(request, response) {
  let newTurnState = request.params.direction;
  if (newTurnState == 'left') {
    response.send('The robot makes a sharp turn to the ' + newTurnState);
  } else if (newTurnState == 'right') {
    response.send('The robot makes a sharp turn to the ' + newTurnState);
  } else {
    response.send('Something went terribly wrong. Bots are stupid like that. Try left or right?');
  }
  response.end();
}

//Move back
function moveBack(request, response) {
  response.send('The bot retreats back not knowing what lies behind it.');
}

//Move forward
function moveForward(request, response) {
  response.send('The bot whirrs forward towards an indeterminate future.');
}

//Happy dance
function happyDance(request, response) {
  response.send('The bot spins on its own axis silent and alone.');
}

Currently listening: Keep Talking-Pink Floyd

Apology as a Service (AAAS)

For our first assignment for Critical Objects, I teamed up with Winnie Yoe to work on a critical object. The shop was shut for the week, and we started talking about how we should write an apology for not doing the assignment. This led us to a further discussion of how apologies are manufactured, as if they follow a formula.

We went down the rabbit-hole of digging up apologies from Kevin Spacey to Facebook to Uber and many others, and we came up with the formula as follows:

[Inspirational title] → [Demonstrate passion] → [Play the Victim] → [Feign innocence of events] → [Cautiously appreciate the victims] → [Ask for time]→ [Recognise role of company without any direct acceptance of wrong-doing]→ [Promise indeterminate actions in the future]→ [Promise that it won’t happen again]→ [salutations]→ [Actual signature].

While analysing the responses, we realised that the formula caters to institutional anxieties and is about protecting the organisation rather than the aggrieved. We came up with the idea of a future service for CEOs: a voice-driven interface for generating apologies. We named it Bernays, after Edward Bernays, the father of modern PR, documented quite extensively in The Century of the Self.

The hypothetical device sits on the CEO’s desk, talks to them about the current issue and uses advanced AI to understand the situation. It asks the CEO for ‘uncomputable‘ information, which helps it create a more nuanced approach to the situation, and generates an apology and a strategy for handling it.

You can scroll through the UI below. A sister post on the project can be found here.