Last night, I fell asleep under the stars, the chirp of crickets intermingling with the old radiator’s whistle off in the distance. I’d just finished an episode of Justified: City Primeval on the big screen. It was a constant 68 degrees, but I tucked myself into the duvet nonetheless. For tonight, I’m thinking the surface of the moon, or perhaps the edge of a Hawaiian volcano.
According to most estimates, the average American spends around seven hours a day in front of screens. The Centers for Disease Control and Prevention recommends something in the neighborhood of two hours. But for all the increased focus on sleep hygiene and the harmful effects of staring at displays all day, it seems society is swiftly moving in the opposite direction.
When we refer to “screen time,” we’re largely talking about phones, computers, televisions — that sort of thing. All the while, an entirely distinct paradigm has been looming on the horizon for some years. In the case of the Vision Pro, we’re talking two screens — one per eye — with a combined 23 million pixels.
These screens are, of course, significantly smaller than the other examples, but they’re right there in front of your eyes, like a $3,500 pair of glasses. This is something I’ve been thinking about quite a bit over my first 48 hours with the Vision Pro.
In 2018, Apple introduced Screen Time as part of iOS 12. The feature is designed to alert users to their — and their children’s — device usage. The thinking goes that when presented with such stark figures at the end of each week, people will begin to rethink the way they interface with the world around them. Tomorrow, Apple is finally releasing the Vision Pro. The device is another effort to get people to rethink the way they interact with the world, albeit in entirely the opposite direction.
I’ve spent much of the past two years attempting to break myself of some of my worst pandemic habits. Toward the top of the list are all those nights I fell asleep watching some bad horror movie on my iPad. I’ve been better about this. I’m reading more and embracing the silence. That is, until this week. The moment the Vision Pro arrived, all that went out the window.
Now, there’s a certain extent to which much of this can be written off as part of my testing process. To review a product, you need to live with it as much as possible. In the case of the Vision Pro, that means living my life through the product as much as possible. I’m taking work calls on it, and using it to send emails and Slack messages. I’m listening to music through the audio pods and — as mentioned up top — using it to watch my stories.
Even my morning meditation practice has moved over to the headset. It’s that classic irony of using technology to help counteract some of the problems it introduced into our lives in the first place.
While my job requires me to use the Vision Pro as much as humanly possible while I have it, I have to assume my experience won’t be entirely dissimilar from that of most users. Again, you’re going to want to get the most out of a $3,500 device, which invariably translates to using it as much as you can.
When I wrote Day One of this journal yesterday, I cautioned users to ease into the world of Vision Pro. In a very real way, I wish I’d better heeded my own advice. By the end of my first 24 hours, the nausea started hitting me hard. Your results will, of course, vary. I’m prone to car and sea sickness myself. That patch you see behind my right ear in some of the Vision Pro photos is for the former. (It’s probably a placebo, but sometimes fooling yourself is the best medicine.)
VR sickness and car sickness actually operate in similar ways. They’re caused by a mismatch between what your eyes and inner ear are perceiving. Effectively, your brain is getting mixed signals that it’s having trouble reconciling.
In some ways, this phenomenon gets to the heart of something fundamental in mixed reality. Even in the world of passthrough AR, there’s a disconnect between what you see and what your body feels. The Vision Pro’s passthrough is the best I’ve experienced in a consumer device. The cameras capture your environment and transmit it to your eyes as quickly as possible. Using this technology, the headset can overlay computer graphics on the real world — a phenomenon Apple refers to as “spatial computing.”
This gets to something important about this brave new world. Extended reality isn’t reality. It’s the world filtered through a computer screen. Now, we could spiral into an existential argument fairly quickly here.
This week I’ve been reminded of what a Samsung executive said when confronted with claims that the company is “faking” moon photos with its premium smartphones: “[T]here is no such thing as a real picture. As soon as you have sensors to capture something, you reproduce [what you’re seeing], and it doesn’t mean anything. There is no real picture. You can try to define a real picture by saying, ‘I took that picture’, but if you used AI to optimize the zoom, the autofocus, the scene – is it real? Or is it all filters? There is no real picture, full stop.”
Sorry, but I need to be a lot more stoned to have that specific conversation. For now, however, the Vision Pro is making me question how comfortable I am in a future where “screen time” largely involves having screens strapped to my face. The effect is undeniably intriguing, pointing to some incredibly innovative applications in the near future (I’m sure we’ll see a number of these among the initial 600 apps).
Maybe bracing yourself for the future means embracing bleeding-edge technologies while knowing when it’s time to touch grass. That 2.5-hour battery pack might not be the worst thing after all.