I’ve seen “physical computing” described as making computers interact with the physical world (sensors, buttons, etc.), but I’m curious whether computing tasks themselves can be given a physical form and pace suited to direct human interaction. Like lifting a block from one place to another to transfer a file. It’s purposely inefficient, but directly relatable.
I do not want to wear goggles to enter a virtual space, I want the exact opposite.

@tendigits That's called phidgets or phydgets (for "physical widgets"). Some experiments from the early 2000s are here: grouplab.cpsc.ucalgary.ca/phid (with Nerf guns firing emails at you, and various things interacting with Windows Live Messenger).

@pulkomandy Oh, fascinating. This looks like a great rabbit hole to explore, thank you!
