Code 13: Servo Dreams & Digital Hallucinations

Where machines finally cooperate, hands start hallucinating, and screens develop mysterious leaks

APO: Art Properly Oriented

After the A4988 debacle documented in my last post, I'm pleased to report a breakthrough on the hardware front. The servo experiment actually worked! I've mounted an illustration on the servo to test rotation, creating what I'm calling APO (Art Properly Oriented).

No more awkwardly turning your head to understand a painting or illustration. The piece rotates for you, transforming the viewing experience from passive to dynamic. There's something delightfully unnecessary yet completely compelling about artwork that refuses to stay still.

Sometimes the most interesting innovations emerge from solving problems nobody actually has.

The mechanical aspects were surprisingly straightforward compared to my previous hardware struggles. Perhaps the universe decided I'd suffered enough with the A4988 and granted me this small victory as compensation.
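
For the curious, the rotation itself takes remarkably little code. The sketch below is roughly the shape of it, written with gpiozero for a Raspberry Pi; the pin number, angles, and timing are placeholders rather than my exact wiring.

    import time
    from gpiozero import AngularServo

    # Hypothetical wiring: the servo's signal wire on GPIO 17.
    servo = AngularServo(17, min_angle=-90, max_angle=90)

    # Sweep the artwork slowly back and forth so the viewing angle never settles.
    while True:
        for angle in list(range(-90, 91, 5)) + list(range(85, -91, -5)):
            servo.angle = angle
            time.sleep(0.1)

A sweep loop like that is the whole trick on the software side; the rest is mounting hardware.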

Hand Hallucinations: Deep Dream in Motion

Moving beyond still images, my Deep Dream experimentation has expanded into video territory. The latest test features my own hand transforming into a pulsing, morphing dreamscape of algorithmic hallucinations.

There's something particularly unsettling about watching a body part—something so familiar and utilitarian—dissolve into these organic-mechanical hybrid forms. The hand, our primary tool for creating, becomes a canvas for machine vision to reinterpret and recreate.
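
For anyone who wants to try this on their own footage: the video version is essentially the still-image process applied frame by frame. The sketch below isn't my exact pipeline, just the general shape of it, using OpenCV for the frames and a pretrained torchvision GoogLeNet for the hallucinating; the layer choice, step count, and file names are all placeholders.

    import cv2
    import numpy as np
    import torch
    from torchvision import models

    device = "cuda" if torch.cuda.is_available() else "cpu"
    model = models.googlenet(weights="DEFAULT").to(device).eval()

    # Grab the activations of one inner layer via a forward hook.
    acts = {}
    model.inception4c.register_forward_hook(lambda m, i, o: acts.update(out=o))

    def dream(frame, steps=10, lr=0.02):
        # frame is an HxWx3 uint8 BGR image straight from OpenCV.
        img = torch.tensor(frame[..., ::-1] / 255.0, dtype=torch.float32)
        img = img.permute(2, 0, 1).unsqueeze(0).to(device).requires_grad_(True)
        for _ in range(steps):
            model(img)
            loss = acts["out"].norm()      # amplify whatever this layer "sees"
            loss.backward()
            with torch.no_grad():
                img += lr * img.grad / (img.grad.abs().mean() + 1e-8)
                img.clamp_(0, 1)
                img.grad.zero_()
        out = img.detach().squeeze(0).permute(1, 2, 0).cpu().numpy()
        return np.ascontiguousarray((out[..., ::-1] * 255).astype(np.uint8))

    cap = cv2.VideoCapture("hand.mp4")     # placeholder clip name
    fps = cap.get(cv2.CAP_PROP_FPS)
    w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
    h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
    out_video = cv2.VideoWriter("hand_dream.mp4", cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))

    # Slow by design: ten forward/backward passes per frame.
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        out_video.write(dream(frame))

    cap.release()
    out_video.release()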

Dimensional Shifts: TouchDesigner 3D Experiments

The boy has officially started experimenting with TouchDesigner's 3D capabilities. After weeks of working with 2D filters and effects, stepping into the third dimension feels like discovering an entirely new instrument within an already complex orchestra.

What began as simple geometric explorations quickly evolved into experiments with lighting, texture, and perspective. There's a particular joy in watching flat concepts take on volume and depth, especially when you can manipulate them in real-time.
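
TouchDesigner's Python layer is a big part of that real-time feel. As a trivially small example of what I mean, a few lines in an Execute DAT can keep a piece of geometry spinning and bobbing on every frame; 'geo1' here is a hypothetical Geometry COMP name rather than anything from my actual network.

    # Lives in an Execute DAT inside TouchDesigner; 'geo1' is a hypothetical Geometry COMP.
    import math

    def onFrameStart(frame):
        geo = op('geo1')
        geo.par.ry = absTime.seconds * 30             # steady spin around the Y axis
        geo.par.ty = 0.2 * math.sin(absTime.seconds)  # gentle bob up and down
        return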

I'm still very much at the beginning of this 3D journey—more stumbling than dancing—but the potential for merging these techniques with my existing work feels enormous. Particularly exciting is the possibility of creating environments for my illustrated characters to inhabit.

SPADE-COCO: Machine Vision Still Life

Trying to make basic images with SPADE-COCO has been an exercise in both fascination and frustration. For those unfamiliar, SPADE-COCO allows for semantic image synthesis—essentially generating photorealistic images from segmentation maps.
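
In practice, most of my time goes into preparing the segmentation map rather than running the network. The sketch below shows the general idea: paint regions of class IDs into a single-channel image, save it, and hand it to a pretrained SPADE checkpoint (for instance via the test script in the NVlabs SPADE repo). The class IDs here are placeholders; the real values come from the COCO-Stuff label list.

    import numpy as np
    from PIL import Image

    H, W = 256, 256
    label_map = np.zeros((H, W), dtype=np.uint8)

    # Placeholder COCO-Stuff class IDs; look up the real ones in the label list.
    TABLE, WALL, FRUIT = 68, 112, 55

    label_map[:, :] = WALL              # fill the background with "wall"
    label_map[160:, :] = TABLE          # lower band becomes the table surface
    label_map[120:170, 90:160] = FRUIT  # a rough blob where the fruit should sit

    # Save as a single-channel PNG for the SPADE test script to pick up.
    Image.fromarray(label_map, mode="L").save("still_life_label.png")

The generated image comes from pushing that label map through the COCO-trained weights; the fascination and the frustration both live in how loosely the network respects those painted boundaries.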

My latest experiment was a still life, which prompts the question: Can you identify the objects in the generated composition? The algorithm's interpretation of common objects sometimes veers into the uncanny valley—almost right but subtly wrong in ways that are difficult to articulate.

What I find most compelling about these experiments is the window they provide into machine perception. The AI doesn't "see" objects as we do—it recognizes patterns and relationships that approximate our understanding of physical things, but with its own peculiar logic and limitations.

Glamorous Failures: The Smoke & Screen Saga

Not all experiments lead to success, of course. My ambitious attempt to pair a smoke machine with a projector screen failed spectacularly. The idea was to create three-dimensional projection spaces using smoke as a semi-solid surface—a concept that remains compelling in theory but proved challenging in execution.

I'll share more details in future posts, but the short version involves uneven smoke distribution, unexpected air currents, and the discovery that my screen appears to be leaking. Yes, leaking. I'm still not entirely sure what that means, but there's definitely something unparalleled about it.

The Conversation Continues

These experiments, both successful and failed, continue the dialogue between human creativity and technological tools. What fascinates me most is how quickly the line blurs between intentional creation and happy accident, between human direction and machine suggestion.

Whether it's a servo motor finally cooperating, a hand transformed by neural networks, or a screen developing its own peculiar character through technical flaws, each interaction opens new possibilities for expression and understanding.

Until next time, keep experimenting. Remember that in the space between what you intended and what actually happened, there's often something far more interesting than either.