Sam Iglesias
@sam
I’m going to create a thread of interesting talks from this week’s WWDC, with tiny comments. Andrew was my closest collaborator when I was doing engineering on this project, and we had a particular fondness for space-based examples. https://developer.apple.com/wwdc23/10109
12 replies
4 recasts
27 reactions

Sam Iglesias
@sam
SDK is out. Was reminded the other day that the first commit message for one of the major forks was “sam code” 🤦 https://t.co/aVVnTPevyT
0 replies
1 recast
4 reactions

Sam Iglesias
@sam
For a product so defined by its relationship to your visual field, sound has an unexpectedly heightened importance. For example, it’s far easier to get you to look at something behind you by playing a sound there than to put up an alert. https://developer.apple.com/videos/play/wwdc2023/10271/
1 reply
0 recasts
2 reactions

Sam Iglesias
@sam
The fact that SwiftUI is declarative makes it really nice for cross-platform because out of the box it commits the developer to very little in the way of UI components, dimensionality, or even input.
0 replies
1 recast
1 reaction
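A minimal sketch of what that buys you. This is plain SwiftUI with no platform-specific API (the view and its contents are illustrative, not from the talk): the code declares intent, not concrete controls or input handling, so the system can map it to mouse, touch, or gaze-and-pinch as appropriate.

```swift
import SwiftUI

// The same declarative view works across platforms because it commits
// to very little: no fixed control style, window size, or input model.
struct GreetingView: View {
    @State private var count = 0

    var body: some View {
        VStack(spacing: 12) {
            Text("Tapped \(count) times")
            Button("Tap") { count += 1 }  // pointer, touch, or gaze + pinch
        }
        .padding()
    }
}
```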

Sam Iglesias
@sam
The million-dollar question for HMDs is input. The most natural approach, and also the most ambitious, is eyes and hands, which work remarkably well thanks to the wide field of view and the optimized placement of the camera system. Eugene and Israel talk through the design implications: https://developer.apple.com/wwdc23/10073
0 replies
0 recasts
2 reactions
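A hedged sketch of how eyes-and-hands input surfaces to developers on visionOS (assuming the visionOS SDK; the view and entity names here are illustrative): the system resolves gaze plus pinch into ordinary SwiftUI gestures, so apps handle taps rather than raw eye data.

```swift
import SwiftUI
import RealityKit

// Gaze + pinch arrives as a SpatialTapGesture; the entity just needs
// an input target and collision shape to be hittable.
struct TappablePlanet: View {
    var body: some View {
        RealityView { content in
            let sphere = ModelEntity(mesh: .generateSphere(radius: 0.1))
            sphere.components.set(InputTargetComponent())
            sphere.generateCollisionShapes(recursive: false)
            content.add(sphere)
        }
        .gesture(SpatialTapGesture().targetedToAnyEntity().onEnded { value in
            print("tapped \(value.entity)")
        })
    }
}
```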

Sam Iglesias
@sam
As mentioned in the keynote, the device uses the room’s 3D map to simulate how sound waves produced in different parts of the space will sound (reflect, be absorbed, etc.), and we are remarkably attuned to it.
1 reply
0 recasts
2 reactions
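From the developer's side, that room-aware rendering comes mostly for free. A minimal sketch using RealityKit's spatial audio API (the function and the `"chime.wav"` asset name are illustrative assumptions, not from the cast):

```swift
import RealityKit

// Attach spatial audio to an entity; the system's room-aware renderer
// places the sound at that point in space and simulates reflections.
func playChime(from entity: Entity) throws -> AudioPlaybackController {
    entity.spatialAudio = SpatialAudioComponent(gain: -5)  // slight attenuation
    let resource = try AudioFileResource.load(named: "chime.wav")
    return entity.playAudio(resource)
}
```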

Sam Iglesias
@sam
Not relevant to Vision, but I have to share this one by Mac and Chan on dynamic Live Activities, if only for the sumptuous all-out animations Chan did in the second half. https://developer.apple.com/videos/play/wwdc2023/10194/
0 replies
0 recasts
2 reactions

Sam Iglesias
@sam
In this talk, he shows off the delicate solution the team landed on: abstractions that are familiar from SwiftUI’s 2D layout paradigm yet powerful enough to express volumetric content, with hooks into RealityKit and some lightweight 3D modifiers.
1 reply
0 recasts
1 reaction
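A small sketch of those "lightweight 3D modifiers" (assuming visionOS; the view and its contents are illustrative): familiar 2D layout, extended with depth-aware variants.

```swift
import SwiftUI

// Ordinary ZStack layout, plus visionOS-only depth modifiers.
struct CardStack: View {
    var body: some View {
        ZStack {
            Text("Back card")
            Text("Front card")
                .offset(z: 40)    // lift 40 points toward the viewer
        }
        .frame(depth: 100)        // give the stack a bounded depth
    }
}
```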

Sam Iglesias
@sam
This easter egg is Mark’s pet rock, which he found in San Francisco and which became one of the team’s unofficial mascots. I came up with the name Rock Mikewell for him, a play on our VP’s name, Mike Rockwell.
0 replies
0 recasts
1 reaction

Sam Iglesias
@sam
Chan’s the designer who did swipe-to-go-home for iPhone X as well as the Dynamic Island, among many other things. He’s incredible. Mac did the Lock Screen redesign and the widget system, and also, incidentally, was my designer for Tea back in 2011 when we were both starting out. They’re among the best.
0 replies
0 recasts
1 reaction

Sam Iglesias
@sam
Having the cameras aimed downward not only allows for face tracking during conversations; it also lets your arms rest naturally at your sides without needing to be in the field of view, reducing fatigue.
0 replies
0 recasts
0 reactions

Sam Iglesias
@sam
Here’s Mark’s talk that goes into more detail about how the Russian nesting doll of 2D-3D-2D works with SwiftUI and RealityView. Attachments are a remarkably powerful way to get 2D UI to show up within a 3D context. Another space app too! https://developer.apple.com/videos/play/wwdc2023/10113/
0 replies
0 recasts
0 reactions
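A hedged sketch of the attachments pattern the talk covers (assuming the visionOS SDK; the view, the `"label"` identifier, and the geometry are illustrative): SwiftUI views are declared as attachments, resolved into entities, and positioned inside the 3D scene alongside RealityKit content.

```swift
import SwiftUI
import RealityKit

// 2D SwiftUI (the label) nested inside 3D (the sphere) inside a 2D scene.
struct PlanetView: View {
    var body: some View {
        RealityView { content, attachments in
            let planet = ModelEntity(mesh: .generateSphere(radius: 0.1))
            content.add(planet)
            // The attachment comes back as an entity we can place in 3D.
            if let label = attachments.entity(for: "label") {
                label.position = [0, 0.15, 0]  // float above the sphere
                content.add(label)
            }
        } attachments: {
            Attachment(id: "label") {
                Text("Planet").padding().glassBackgroundEffect()
            }
        }
    }
}
```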

nir.eth 🌿🟣🐦☁️
@nir
Cool
0 replies
0 recasts
1 reaction