The smart glasses revolution is heating up, and Samsung doesn't want to be left behind. After months of speculation, concrete details about the company's first foray into wearable eyewear have surfaced, giving us a clearer picture of what Samsung believes the future of hands-free computing should look like. This matters because smart glasses represent the next frontier in how we interact with technology—moving beyond phones and watches to something we literally wear on our faces all day.

For Samsung, this is a strategic move to compete in a space where Meta and Google are already making serious plays. The company has been quietly developing these glasses under the codename "Jinju," and leaked images suggest a sleek, relatively conventional design that won't make you look like you're wearing a sci-fi prop. The first-generation model is expected to debut in 2026, likely with a formal announcement at Samsung's major July Unpacked event, followed by an actual release later in the year.

Here's what the Jinju glasses will reportedly pack: a 12-megapixel camera, Qualcomm's Snapdragon AR1 chip (specifically designed for augmented reality), and bone-conduction speakers that direct sound to your ears without blocking ambient noise. Notably, this first version won't have a display—meaning you won't see overlaid information floating in your field of vision. Instead, think of it as a smart camera and audio device that works seamlessly with your phone and Google's Gemini AI assistant. The expected price range sits between $380 and $500, though rising component costs could push that higher at launch.

Samsung isn't stopping there. A second-generation model, codenamed "Haean," is already in development and will arrive in 2027 with the display feature everyone's really waiting for. That version could cost $600 to $900 and would finally bring Samsung into direct competition with devices like the Meta Ray-Ban Display glasses, which already offer augmented reality capabilities built right into the frames.

The bigger picture here is that Samsung is using Android XR—Google's new wearables platform—as the foundation for these glasses. This is crucial because it means deep integration with Google's Gemini chatbot and the broader Android ecosystem. Smart glasses are becoming less about standalone devices and more about extensions of your existing digital life, pulling in AI assistance, camera capabilities, and real-time information when you need it.

The smart glasses market is still finding its footing. Meta's Ray-Ban glasses have gained traction by focusing on practical features like video recording and AI-powered photo analysis rather than flashy displays. Google is building its own vision around Gemini integration. Samsung's two-phase approach—starting with a display-less version, then adding visual AR later—suggests the company is being cautious, testing the market before fully committing to the more expensive and complex display technology.

CuraFeed Take: Samsung's strategy reveals something important about where the industry sees smart glasses heading: they're not ready to be all-in-one AR devices yet. The first generation is essentially a premium camera and AI-powered audio companion, which is honest about current limitations. This positions Samsung to capture early adopters without betting everything on display technology, like micro-LED at this scale, that still faces manufacturing challenges. The real competition isn't between Samsung, Meta, and Google right now; it's whether consumers actually want to wear computers on their faces. Samsung's phased approach hedges that bet smartly.

Watch for pricing pressure: at $380-$500 for a display-less device, Samsung needs to convince people these glasses solve real problems beyond "I could use my phone instead." The 2027 display version will be the true test of whether Samsung can compete with Meta's head start in this space. If component costs keep rising, we could see all three companies struggling with margins on these devices.