RIEM News

Articles tagged with "augmented-reality"

  • Mark Zuckerberg says a future without smart glasses is ‘hard to imagine’

    Mark Zuckerberg expressed strong confidence in the future of AI-powered smart glasses during Meta’s recent earnings call, suggesting that in a few years, it will be difficult to imagine most glasses not having AI capabilities. He compared this shift to the transition from flip phones to smartphones, highlighting that billions of people already wear glasses or contacts, making smart glasses a natural next step. Zuckerberg noted that sales of Meta’s smart glasses have tripled in the past year, calling them some of the fastest-growing consumer electronics in history. Meta is actively investing in multiple smart glasses models, including Oakley-branded glasses designed for exercise. Despite Zuckerberg’s optimistic outlook, some skepticism remains given past overestimations about the metaverse’s adoption. However, the broader tech industry appears to be aligning with Meta’s vision. Google is reportedly collaborating with Warby Parker on smart glasses, Apple is developing AI glasses and related devices, and Snap is spinning off its AR glasses business. Even OpenAI is exploring AI wearables, though more focused

    IoT, AI-wearables, smart-glasses, augmented-reality, consumer-electronics, Meta, wearable-technology
  • Snap gets serious about Specs, spins AR glasses into standalone company

    Snap is preparing to launch the latest consumer version of its augmented reality (AR) glasses, known as Specs, later this year. To enhance focus and streamline development, Snap has spun off the AR glasses division into a standalone company. This move reflects Snap’s commitment to advancing its AR hardware, which has been in development for over a decade. The current iteration, the fifth generation of Specs, runs on Snap OS—a proprietary operating system launched in September 2023—and features advanced capabilities such as four cameras for hand tracking, the Snap Spatial Engine for projecting AR imagery, and AI-powered functions like “spatial tips” that provide contextual information about the user’s environment. During a recent demo, the glasses showcased several innovative features including an improved browser, a travel mode for translating foreign text, and interactive AR games like Avatar: The Last Airbender. The devices also support synchronized experiences, allowing multiple users to share the same AR environment simultaneously, which opens possibilities for collaborative gaming and social interactions. However,

    IoT, augmented-reality, AR-glasses, wearable-technology, Snap, spatial-computing, AI-integration
  • Top 7 smart glasses at CES 2026 redefining gaming, AI and productivity

    At CES 2026, several innovative smart glasses were showcased, highlighting advancements in gaming, AI integration, and productivity. The ASUS ROG Xreal R1 stands out with its world-leading 240Hz refresh rate, 1080p HDR display, and a 57-degree field of view, targeting high-end gaming experiences and expected to launch in late 2026. The Xreal 1S offers a more affordable option at $449, featuring 1200p resolution per eye, a 52-degree field of view, and real-time 2D-to-3D content conversion, supporting devices like the Nintendo Switch 2 for versatile use in work and entertainment. Other notable entries include the RayNeo Air 4 Pro, the first HDR10-enabled smart glasses priced at $299, delivering bright 1080p visuals and Bang & Olufsen-tuned audio in a lightweight design. The Rokid AI Glasses integrate AI features such as real-time translation and voice interaction with a

    IoT, smart-glasses, augmented-reality, AI-integration, wearable-technology, CES-2026, gaming-devices
  • After Meta Ray-Ban, Lumus debuts AR glasses with wider, thinner optics

    Lumus, an Israeli optics company known for its geometric (reflective) waveguides, unveiled two new augmented reality (AR) waveguides at CES 2026, building on its success supplying optics for the Meta Ray-Ban Display AR glasses in 2025. The new ZOE waveguide offers a wide field of view exceeding 70 degrees, the first geometric waveguide to do so, enabling immersive AR experiences such as spatial entertainment and multi-app productivity while maintaining wearability. Importantly, ZOE achieves this wide field of view using standard optical glass and existing mass-production methods, avoiding exotic materials. In addition to ZOE, Lumus introduced an optimized Z-30 optical engine with a 30-degree field of view that delivers 40% higher brightness and improved image quality, remaining compact and lightweight at 11 grams with daylight-readable displays. Lumus also previewed the next-generation Z-30 2.0 waveguide, which is 40% thinner and 30

    augmented-reality, AR-glasses, waveguide-technology, optical-materials, smart-eyewear, display-optics, wearable-technology
  • Meta pauses international expansion of its Ray-Ban Display glasses

    Meta has announced a pause in the international expansion of its Ray-Ban Display smart glasses due to overwhelming demand and limited supply. Originally planning to launch the glasses in France, Italy, Canada, and the U.K. in early 2026, the company now intends to focus on fulfilling U.S. orders first, as waitlists for the product currently extend well into 2026. Meta is reassessing its strategy for making the glasses available outside the U.S. amid these supply constraints. At the CES event in Las Vegas, Meta showcased upcoming features for the Ray-Ban Display glasses and its Neural Band accessory. New functionalities include a teleprompter feature for delivering prepared remarks and the ability to write messages by tracing finger movements on any surface, which the Neural Band then transcribes into digital text. Additionally, pedestrian navigation support is being expanded to new cities, including Denver, Las Vegas, Portland, and Salt Lake City.

    IoT, smart-glasses, wearable-technology, Meta, augmented-reality, Neural-Band, pedestrian-navigation
  • Treat yourself: The best smart glasses to buy with your holiday gift money

    The article highlights the growing practicality and sophistication of smart glasses, which have evolved from futuristic gadgets into versatile tools for communication, navigation, fitness tracking, entertainment, and gaming. It presents a curated list of notable smart glasses models available for purchase, catering to various needs such as everyday wear, sports, work, and immersive gaming experiences. The article also notes upcoming product launches, indicating a rapidly expanding smart glasses market. Key models discussed include the Ray-Ban Meta Gen 2 glasses, which combine stylish design with advanced features like a 12-megapixel camera, open-ear speakers, AI voice commands, real-time translation, and up to eight hours of battery life, priced at $379. The Viture Luma Pro glasses stand out for their high-quality Sony micro-OLED display offering a 1200p image on a large virtual screen, 120 Hz refresh rate, and compatibility with multiple devices via USB-C, retailing at $499 (currently $449). Lastly, the premium Xreal

    IoT, smart-glasses, wearable-technology, augmented-reality, AI-features, battery-life, display-technology
  • Brain Gear Is the Hot New Wearable

    The article highlights the emerging trend of brain-focused wearable devices that use electroencephalography (EEG) to monitor and interpret brain waves, moving beyond traditional fitness trackers. These devices leverage AI to analyze electrical impulses from the brain for various applications, such as improving sleep quality, enhancing productivity, and enabling new forms of interaction. For example, Elemind’s $350 headband uses acoustic stimulation to promote deeper sleep by shifting brain activity to delta waves, while Neurable’s $500 EEG-equipped headphones track concentration levels and encourage breaks to optimize work efficiency. Major tech companies like Apple are also entering the neurotech space, developing EEG-sensing AirPods and integrating brain-wave control into their Vision Pro augmented reality headset, enabling users to operate devices with their thoughts via brain-computer interfaces (BCIs). Additionally, startups and nonprofits are exploring open-source neuro apps and brain-controlled games, demonstrating the potential for brainwave-based interaction in entertainment and productivity. The article also discusses medical applications of brain wearables, such

    IoT, wearable-technology, brain-computer-interface, EEG-devices, neurotechnology, augmented-reality, AI-in-healthcare
  • Meta’s AI glasses can now help you hear conversations better

    Meta has introduced a new AI-powered feature for its Ray-Ban Meta and Oakley Meta HSTN smart glasses that enhances users’ ability to hear conversations in noisy environments. This conversation-focus feature uses the glasses’ open-ear speakers to amplify the voice of the person the wearer is talking to, with adjustable amplification levels controlled by swiping the right temple or through device settings. Initially available in the U.S. and Canada, this practical update aims to improve communication in settings like busy restaurants, bars, or public transit. In addition to the conversation-focus feature, Meta is also rolling out a Spotify integration that plays music related to what the wearer is currently looking at—for example, playing songs by an artist whose album cover is in view or holiday music when looking at a Christmas tree. While this functionality is more of a novelty, it showcases Meta’s vision of linking visual context with app actions. The Spotify feature is available in English across multiple countries, including the U.S., Canada, Australia, and

    IoT, smart-glasses, AI, wearable-technology, augmented-reality, audio-enhancement, Meta
  • Google’s first AI glasses expected next year

    Google is set to launch its first AI-powered smart glasses in 2026, building on its partnerships with Gentle Monster and Warby Parker to develop consumer wearables running on Android XR, the same OS powering Samsung’s XR devices. These glasses aim to offer a less bulky and more stylish alternative to traditional headsets, integrating AI and extended reality (XR) seamlessly into daily life. Google is developing multiple models: one focuses on screen-free interaction, using built-in speakers, microphones, and cameras to enable conversations with its Gemini AI and photo capture, while another features an in-lens display visible only to the wearer, capable of showing turn-by-turn navigation and closed captioning. Additionally, Google previewed Project Aura, a wired XR glasses model from Xreal that sits between bulky headsets and minimalist glasses. Project Aura offers extended workplace and entertainment functionality, allowing users to access Google’s suite of products or stream video similarly to more advanced headsets. While Meta currently leads the smart glasses market, particularly through

    IoT, smart-glasses, wearable-technology, AI, augmented-reality, Google, consumer-electronics
  • Google plans 2026 debut for its first AI-powered smart glasses

    Google, in collaboration with Warby Parker, plans to launch its first AI-powered smart glasses in 2026, marking a significant reentry into the augmented reality (AR) and wearable computing market. This partnership, announced at The Android Show | XR Edition, signals Google's renewed ambition to compete with established players like Apple and Meta, who have advanced their own smart eyewear and mixed-reality devices. The glasses aim to be lightweight, stylish, and AI-enabled, designed for everyday wear, though specific details on pricing and battery life remain undisclosed. The upcoming smart glasses will leverage Google's Gemini AI model integrated with the Android XR ecosystem, enabling multimodal interactions—allowing the device to see, hear, understand context, and respond naturally. Google envisions two categories: AI glasses functioning as intelligent, screen-free assistants with speakers, microphones, and cameras, and Display AI glasses featuring in-lens displays for private, heads-up information like navigation and translations. Partnering with brands such as Samsung and Gentle

    IoT, smart-glasses, AI-powered-wearables, augmented-reality, Google, wearable-technology, Android-XR
  • Meta reportedly delays mixed reality glasses until 2027

    Meta has delayed the release of its new mixed reality glasses, codenamed Phoenix, from the second half of 2026 to the first half of 2027. Unlike its existing smart glasses, these new devices are expected to have a form factor similar to Apple’s Vision Pro, featuring a separate puck-like power source. The delay follows internal memos seen by Business Insider, where Meta executives cited CEO Mark Zuckerberg’s directive to prioritize sustainability and higher quality user experiences. According to Meta’s metaverse leaders Gabriel Aul and Ryan Cairns, the postponement will provide additional time to refine the product details. This move aligns with Meta’s broader strategy to ensure the business model behind the glasses is viable and the technology meets higher standards before launch. The article also references a recent Bloomberg report about Meta’s plans, but the content is incomplete and does not provide further details on those plans.

    IoT, mixed-reality, augmented-reality, wearable-technology, Meta, smart-glasses, virtual-reality
  • Photos: World’s first helmet with built-in AR visor launches with universal comms

    Shoei has introduced the GT-Air 3 Smart, the world’s first full-face motorcycle helmet with a fully integrated augmented reality (AR) visor, unveiled at EICMA. Developed in partnership with France’s EyeLights, the helmet features a nano-OLED head-up display (HUD) embedded directly into the visor, projecting critical riding information such as speed, navigation, calls, and radar alerts within the rider’s line of sight. This AR system boasts a brightness of 3,000 nits for clear visibility even in bright sunlight and is designed to improve rider response times by over 30%. The helmet maintains a sleek aerodynamic profile by housing all tech components—including the projector, battery, speakers, and noise-cancelling microphone—inside the shell, preserving comfort and airflow without external mounts. In addition to its advanced AR display, the GT-Air 3 Smart incorporates a universal intercom system compatible with all brands, supporting both cellular and offline mesh communication modes for seamless group connectivity without range limits.

    IoT, augmented-reality, smart-helmet, wearable-technology, AR-visor, universal-communication, motorcycle-safety
  • China jumps ahead of US in race to field lighter battlefield tanks

    China has unveiled its new Type 100 main battle tank, marking a significant shift towards lighter, unmanned, and intelligence-driven armored warfare. Unlike traditional tanks, the Type 100 emphasizes advanced technological integration over heavy armor, featuring a diesel-electric hybrid engine for improved mobility and stealth. It is equipped with a 105mm main gun and an unmanned turret controlled by an advanced fire control system. The tank incorporates radars, infrared, and laser warning systems capable of detecting threats and deploying countermeasures such as interceptor rockets and jamming devices. Additionally, AI enhances situational awareness and networked firepower coordination, while a deployable reconnaissance drone provides aerial surveillance. The crew benefits from augmented reality helmets offering a 360-degree, video game-like view of the battlefield, and the tank can operate both with and without a crew. In response, the U.S. military is developing the M1E3 Abrams tank, which similarly focuses on lighter weight, hybrid electric propulsion, and data-centric defensive systems

    robot, unmanned-vehicles, hybrid-power-system, AI-systems, battlefield-technology, military-drones, augmented-reality
  • Nanoparticle screen hits record clarity visible to the human eye

    Researchers from Swedish institutions—including Chalmers University of Technology, the University of Gothenburg, and Uppsala University—have developed a groundbreaking display technology called retina E-paper, featuring pixels as small as 560 nanometres. This size is smaller than the wavelength of visible light, enabling a pixel density exceeding 25,000 pixels per inch (ppi), roughly 150 times denser than typical smartphone screens. The display uses tungsten oxide nanoparticles to control light scattering and produce highly accurate, tunable red, green, and blue colors. Unlike conventional LED or OLED screens, retina E-paper is reflective, relying on ambient light rather than emitting its own, which significantly reduces energy consumption and allows the screen to be positioned very close to the eye. The retina E-paper’s pixel size corresponds approximately to the size of a single photoreceptor in the human retina, meaning it achieves the maximum resolution perceivable by the human eye. The researchers demonstrated the technology by reproducing Gustav Klimt’s painting “The

    nanoparticles, display-technology, materials-science, energy-efficient-displays, virtual-reality, augmented-reality, tungsten-oxide-nanoparticles
  • Samsung takes on Apple’s Vision Pro with new Galaxy XR headset

    Samsung has launched its Galaxy XR headset as a direct competitor to Apple’s Vision Pro, offering a more affordable option at $1,800—nearly half the price of Apple’s device. The Galaxy XR runs on Google’s Android XR OS and Qualcomm’s Snapdragon XR2+ Gen 2 platform. It features a micro OLED display with 27 million pixels (surpassing Vision Pro’s 21 million), a resolution of 3,552 x 3,840, and a 90Hz refresh rate compared to Vision Pro’s 120Hz. Weighing 545 grams, it is lighter than Apple’s headset, which weighs between 750g and 800g. The device supports up to two hours of general use and two and a half hours of video playback, and includes multiple cameras for pass-through, world tracking, and eye tracking. Samsung emphasizes ergonomic design for comfort, with a balanced frame to reduce facial pressure. The headset supports various XR-optimized experiences such as immersive 3

    robot, IoT, wearable-technology, augmented-reality, virtual-reality, smart-devices, XR-headset
  • Anduril unveils supersoldier helmets for US Army with Meta support

    Anduril Industries has unveiled EagleEye, an AI-powered modular helmet system designed to enhance battlefield awareness and command capabilities for the US Army and allied forces. EagleEye integrates mission planning, perception, and survivability into a lightweight, wearable architecture that acts as a “new teammate” for soldiers. Central to the system is a high-resolution, collaborative 3D mission planning interface that allows troops to rehearse missions and visualize terrain using live video feeds and sensor data. The helmet’s heads-up display (HUD) overlays digital information directly onto the operator’s real-world view, with versions suitable for both daytime and night operations. It also features integrated blue force tracking, providing precise teammate locations within complex environments, and connects to Anduril’s Lattice network—a distributed sensor mesh that fuses data from drones, ground vehicles, and other assets to detect threats beyond line of sight. EagleEye emphasizes protection and survivability through an ultralight ballistic and blast-resistant shell equipped with rear and side sensors for

    robot, IoT, military-technology, AI, wearable-technology, sensor-networks, augmented-reality
  • Anduril’s new EagleEye MR helmet sees Palmer Luckey return to his VR roots

    Anduril Industries, a Silicon Valley defense firm co-founded by Palmer Luckey—the original creator of Oculus VR—has unveiled EagleEye, a modular mixed-reality helmet system designed to enhance soldiers with AI-augmented capabilities. Built on Anduril’s Lattice software, EagleEye integrates command-and-control tools, sensor feeds, and AI directly into a soldier’s field of vision, offering features such as live video feeds, rear- and side-sensors for threat detection, and real-time teammate tracking. The system comes in multiple variations, including a helmet, visor, and glasses, aiming to provide soldiers with enhanced situational awareness and decision-making abilities. This launch aligns with the U.S. Army’s efforts to diversify its mixed-reality gear suppliers beyond Microsoft’s troubled $22 billion IVAS program. In September, Anduril secured a $159 million contract to prototype a new mixed-reality system as part of the Soldier Borne Mission Command initiative, marking the largest effort to equip soldiers

    robot, augmented-reality, mixed-reality, AI, military-technology, wearable-technology, soldier-systems
  • Fundamental XR launches Fundamental Touch for wireless haptics - The Robot Report

    Fundamental XR has launched Fundamental Touch, a wireless haptics platform designed to deliver precise, untethered tactile feedback across multiple industries beyond healthcare, including robotics, industrial training, automotive, aerospace, retail, and gaming. This new software removes the traditional physical tether required by high-fidelity kinesthetic haptic devices, enabling greater user mobility and performance parity. Built on a client-server architecture, Fundamental Touch decouples haptic simulations from visual rendering and user interfaces, allowing sub-100ms latency and scalable, real-time force feedback via a peer-to-peer network layer. The system supports various output devices such as XR headsets (e.g., Apple Vision Pro, Meta Quest), robotic platforms (e.g., Boston Dynamics’ Spot), and gaming peripherals. Fundamental XR, formerly FundamentalVR, has a strong track record in healthcare, where its immersive technologies have reduced onboarding time by over 60%, improved surgical accuracy by 44%, and increased sales performance by 22%. The company has delivered

    robot, wireless-haptics, human-machine-interaction, augmented-reality, virtual-reality, precision-kinesthetic-haptics, immersive-technology
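
    The client-server split described above can be illustrated with a minimal sketch. The Python below is a hypothetical, illustrative toy (class and function names are invented, not Fundamental XR's Fundamental Touch API): a high-rate haptic loop runs in its own thread and owns the force computation, while a slower render loop only reads snapshots of shared state, so visual frame time never eats into the sub-100 ms haptic latency budget.

```python
# Minimal sketch of a haptic loop decoupled from rendering (hypothetical,
# illustrative only -- not Fundamental XR's Fundamental Touch API).
import math
import threading
import time


class HapticState:
    """Snapshot of the haptic simulation shared with the render client."""

    def __init__(self):
        self._lock = threading.Lock()
        self.position = 0.0  # 1-D tool position in metres
        self.force = 0.0     # force command in newtons

    def update(self, position, force):
        with self._lock:
            self.position, self.force = position, force

    def snapshot(self):
        with self._lock:
            return self.position, self.force


def haptic_server(state, stop, rate_hz=1000):
    """High-rate loop: spring force against a virtual wall at x = 0."""
    stiffness = 300.0  # N/m, arbitrary illustrative value
    t0 = time.perf_counter()
    while not stop.is_set():
        t = time.perf_counter() - t0
        position = 0.02 * math.sin(2 * math.pi * 0.5 * t)  # stand-in for device input
        force = -stiffness * position if position < 0 else 0.0
        state.update(position, force)
        time.sleep(1.0 / rate_hz)


def render_client(state, stop, fps=60, frames=120):
    """Slow loop: reads the latest snapshot without blocking the haptic thread."""
    for _ in range(frames):
        position, force = state.snapshot()
        print(f"render frame: x={position:+.4f} m, f={force:+.2f} N")
        time.sleep(1.0 / fps)
    stop.set()


if __name__ == "__main__":
    state, stop = HapticState(), threading.Event()
    threading.Thread(target=haptic_server, args=(state, stop), daemon=True).start()
    render_client(state, stop)
```

    A real deployment would replace the in-process shared state with the platform's networked peer-to-peer layer, but the division of labour is the same idea the article describes: the haptic side owns force computation, the visual side only consumes state.
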
  • Apple shelves Vision Pro overhaul to focus on AI glasses

    Apple has decided to pause its plans to overhaul the Vision Pro headset in order to concentrate on developing AI-powered smart glasses that can rival Meta’s offerings. Previously, Apple was working on a cheaper and lighter version of the Vision Pro, but staff from that project are now being reassigned to focus on smart glasses development. According to Bloomberg’s Mark Gurman, Apple is working on at least two smart glasses models: the first, called N50, will connect to an iPhone and lack its own display, with a potential unveiling as early as next year and a release planned for 2027. The second model will include a built-in display and is designed to compete directly with Meta’s recently unveiled smart glasses. Although this display-equipped version was originally slated for release in 2028, Apple is accelerating its development timeline. Despite this strategic pivot, Apple remains behind Meta, which introduced its first smart glasses back in 2021. This shift highlights Apple’s intent to prioritize AI integration and smart eyewear.

    IoT, smart-glasses, augmented-reality, wearable-technology, Apple, AI-glasses, consumer-electronics
  • This Startup Wants to Put Its Brain-Computer Interface in the Apple Vision Pro

    Startup Cognixion is launching a clinical trial to integrate its noninvasive brain-computer interface (BCI) technology with Apple’s Vision Pro headset to help paralyzed individuals with speech impairments communicate using their thoughts. Unlike implant-based BCIs from companies like Neuralink, Cognixion’s system uses a custom headband equipped with six EEG sensors that detect brain signals related to visual fixation, enabling users to select options via mental attention. The trial will involve up to 10 participants in the US with speech disorders caused by conditions such as spinal cord injury, stroke, traumatic brain injury, or ALS. Cognixion’s technology combines hardware with AI-driven software that customizes communication models based on each user’s speech history and patterns, allowing for near-normal conversation speeds. Previously tested with ALS patients using their own Axon-R headset, the company now aims to leverage the broader functionality and app ecosystem of the Vision Pro to democratize access to BCI communication tools. Cognixion’s approach focuses

    robot, brain-computer-interface, wearable-technology, assistive-technology, augmented-reality, AI-communication, medical-devices
  • Mark Zuckerberg has begun his quest to kill the smartphone

    Meta CEO Mark Zuckerberg has unveiled the Meta Ray-Ban Display, a new generation of smart glasses designed to reduce smartphone dependence and restore social presence lost to phone use. The glasses integrate with a novel Meta Neural Band wristband that uses surface electromyography (sEMG) to read the electrical signals the brain sends to the hand muscles, enabling users to compose text messages silently by mimicking writing gestures. Zuckerberg demonstrated texting speeds of about 30 words per minute, which is competitive with average smartphone typing speeds, marking a significant advancement over previous voice or gesture-based input methods. This innovation represents Meta’s strategic effort to capture hardware market share currently dominated by Apple and Google, reducing reliance on their app store revenues. Despite Meta Reality Labs’ history of costly projects and mixed results, the Ray-Ban Display and Neural Band showcase promising technology that could redefine user interaction by minimizing screen time and promoting more natural, discreet communication. However, it remains uncertain whether consumers will adopt this new interface over traditional smartphones, making this a high-stakes bet.

    IoT, smart-glasses, wearable-technology, Meta-Reality-Labs, gesture-control, neural-interface, augmented-reality
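
    As a rough, entirely hypothetical illustration of the kind of pipeline such a wristband implies (this is not Meta's Neural Band implementation; every name below is invented), the sketch reduces short windows of multichannel sEMG samples to per-channel RMS features and matches them to the nearest calibrated gesture template. At the demonstrated ~30 words per minute, a decoder only needs to resolve roughly 2.5 characters per second.

```python
# Toy sketch of windowed sEMG gesture decoding (hypothetical, illustrative
# only -- not Meta's Neural Band pipeline). Each window of multichannel
# samples is reduced to RMS features and matched to the nearest template.
import numpy as np

CHANNELS = 8   # electrodes around the wrist
WINDOW = 200   # samples per decision window (e.g. 100 ms at 2 kHz)
GESTURES = ["rest", "pinch", "swipe", "write_a"]


def rms_features(window: np.ndarray) -> np.ndarray:
    """Per-channel root-mean-square amplitude of one (WINDOW, CHANNELS) slice."""
    return np.sqrt(np.mean(window ** 2, axis=0))


def calibrate(examples: dict[str, list[np.ndarray]]) -> dict[str, np.ndarray]:
    """Average RMS features over a few labelled example windows per gesture."""
    return {
        label: np.mean([rms_features(w) for w in windows], axis=0)
        for label, windows in examples.items()
    }


def classify(window: np.ndarray, templates: dict[str, np.ndarray]) -> str:
    """Nearest-template decision on the window's RMS feature vector."""
    feats = rms_features(window)
    return min(templates, key=lambda label: np.linalg.norm(feats - templates[label]))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic calibration data: each gesture activates the channels differently.
    scales = {g: 0.2 + 1.5 * rng.random(CHANNELS) for g in GESTURES}
    examples = {
        g: [rng.normal(0, scales[g], size=(WINDOW, CHANNELS)) for _ in range(5)]
        for g in GESTURES
    }
    templates = calibrate(examples)

    # Decode a fresh window drawn from the "write_a" distribution.
    test = rng.normal(0, scales["write_a"], size=(WINDOW, CHANNELS))
    print("decoded gesture:", classify(test, templates))
```

    A real decoder would use a learned sequence model rather than nearest templates, but the window-to-features-to-decision loop is the basic shape of the interaction the article describes.
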
  • Meta unveils new smart glasses with a display and wristband controller

    Meta has introduced a new pair of Ray-Ban branded smart glasses called Ray-Ban Meta Display, featuring a built-in display on the right lens for apps, alerts, and directions. The glasses are controlled via a wristband called the Meta Neural Band, which detects subtle hand gestures using electromyography (EMG) to interpret signals between the brain and hand. The Neural Band offers 18 hours of battery life and is water resistant. Priced at $800, the Ray-Ban Meta Display will be available for purchase in a few weeks, marking Meta’s latest consumer smart glasses offering aimed at enabling users to perform tasks typically done on smartphones. The Ray-Ban Meta Display builds on the success of Meta’s original Ray-Ban Meta smart glasses and includes an onboard AI assistant, cameras, speakers, and microphones. Users can access Meta apps such as Instagram, WhatsApp, and Facebook, as well as view directions and live translations through the glasses’ display. While this product offers a simpler display

    IoT, smart-glasses, wearable-technology, Meta, augmented-reality, AI-assistant, gesture-control
  • Meta Connect 2025: What to expect and how to watch

    Meta Connect 2025, Meta’s flagship annual conference, will begin Wednesday evening with a keynote by CEO Mark Zuckerberg at the company’s Menlo Park headquarters, also available via free livestream. The event is expected to spotlight Meta’s new AI-powered smart glasses developed in partnership with Ray-Ban and Oakley. Leaks suggest the unveiling of “Hypernova” glasses featuring a heads-up display, cameras, microphones, and an AI assistant controlled by a wristband using hand gestures. Oakley’s new AI smart glasses, designed for athletes with a large unified lens and a single centered camera, are also anticipated. While Meta’s VR Quest headset lineup may not see major updates this year, the company is likely to touch on its Metaverse ambitions, though a significant new Metaverse product is expected closer to the end of 2026. This year’s Connect is particularly significant as it marks Meta’s first since launching its ambitious AI research division, MSL, headed by former Scale AI CEO Alexandr Wang

    IoT, smart-glasses, AI-wearables, Meta-Connect-2025, augmented-reality, wearable-technology, AI-assistant
  • Anduril lands $159M Army contract for ‘superhero’ soldier headset

    Anduril Industries has secured a $159 million contract from the U.S. Army to develop a prototype helmet-mounted mixed reality system under the Soldier Borne Mission Command (SBMC) program, the successor to the Army’s earlier Integrated Visual Augmentation System (IVAS). This new system aims to provide soldiers with enhanced battlefield awareness by integrating night vision, augmented reality, artificial intelligence, and real-time intelligence overlays into a single modular platform. The goal is to enable faster decision-making and clearer situational understanding in contested environments, addressing previous IVAS issues such as user discomfort and technical delays. The SBMC system, built on Anduril’s Lattice platform and developed in partnership with companies like Meta, Qualcomm, and Palantir, offers modular hardware components tailored to mission needs and a software architecture (SBMC-A) that unifies helmet displays with edge computing and battlefield sensors. Recent field trials demonstrated capabilities such as soldiers controlling drones over three kilometers away directly from their headsets without dedicated operators.

    robot, augmented-reality, military-technology, wearable-technology, edge-computing, artificial-intelligence, battlefield-sensors
  • Inside China’s biggest military parade ever: A glimpse of future war

    China’s largest-ever military parade showcased a sweeping array of advanced weaponry, highlighting the country’s rapid modernization and push toward a networked, high-tech military. Key new systems unveiled include the QBZ-191 assault rifle, which replaces the older QBZ-95 and offers improved range, precision, and adaptability with advanced optics. On the ground, China introduced three new armored vehicles: the Type 99B main battle tank, the new Type 100 tank—potentially its first fourth-generation tank featuring active protection systems and battlefield data integration—and the Type 100 infantry fighting vehicle equipped with reconnaissance drones and augmented reality goggles for enhanced situational awareness. Additionally, China displayed the PHL-16 (PCL-191) multiple rocket launcher system, comparable to the U.S. HIMARS, capable of firing various guided rockets and tactical ballistic missiles with ranges exceeding 350 kilometers. The parade also marked the first public concentrated display of China’s nuclear triad, encompassing land-, sea-, and air-based

    robot, military-technology, networked-warfare, advanced-weaponry, drones, augmented-reality, defense-systems
  • China arms tanks with AR headsets for instant 360-degree view

    China is developing augmented reality (AR) headsets for armored vehicle crews, including those operating the ZTZ-201 medium tank and new combat support vehicles. These headsets provide a 360-degree, real-time view by linking to external cameras and sensors, effectively allowing crews to "see through" the tank’s armor, which traditionally limits visibility. The AR system overlays critical battlefield data such as vehicle status, ammunition levels, and targeting information directly onto a transparent heads-up display, enhancing situational awareness in various environments and operational conditions. The modular design suggests potential deployment across multiple vehicle platforms. Beyond vision enhancement, the AR headsets integrate weapon control, enabling gunners to aim by head movement or gaze focus, similar to the U.S. Army’s Apache helicopter targeting system. This feature promises faster reaction times and reduces cognitive load by allowing more intuitive operation. The system supports role flexibility within the crew and maintains distinct functionalities for commanders and drivers. Additionally, it facilitates networked warfare by enabling real-time sharing of

    robot, augmented-reality, military-technology, sensor-integration, battlefield-awareness, head-up-display, weapon-control-systems
  • Harvard dropouts to launch ‘always on’ AI smart glasses that listen and record every conversation

    Two Harvard dropouts, AnhPhu Nguyen and Caine Ardayfio, are launching Halo X, a pair of AI-powered smart glasses that continuously listen to, record, and transcribe every conversation the wearer has. The glasses then display relevant information in real time, such as definitions or answers to complex questions, effectively enhancing the wearer’s intelligence and memory. The startup has raised $1 million in funding led by Pillar VC and plans to offer the glasses for pre-order at $249. Positioned as a potential competitor to Meta’s smart glasses, Halo X aims to provide more advanced functionality without the privacy restrictions Meta has imposed due to its poor reputation on user privacy. However, the glasses raise significant privacy concerns because, unlike Meta’s glasses which have indicator lights to alert others when recording, Halo X is designed to be discreet with no external indicators, effectively enabling covert recording. Privacy advocates warn that normalizing always-on recording devices threatens the expectation of privacy in public and private conversations, especially given that

    IoT, smart-glasses, AI, wearable-technology, privacy-concerns, voice-recognition, augmented-reality
  • Zuckerberg says people without AI glasses will be at a disadvantage in the future

    Meta CEO Mark Zuckerberg expressed a strong belief that AI-enabled glasses will become the primary interface for interacting with artificial intelligence in the future. Speaking during Meta’s second quarter earnings call, he argued that people without such AI glasses will face significant cognitive disadvantages compared to those who have them. Zuckerberg highlighted that glasses are an ideal form factor because they can allow AI to see and hear what the user experiences throughout the day and provide real-time interaction. Adding displays—whether wide holographic fields or smaller screens—will further enhance their utility. Meta has been actively developing smart glasses, such as the Ray-Ban Meta models, which have proven unexpectedly popular and generate revenue through a partnership with EssilorLuxottica. Despite Reality Labs, Meta’s division focused on these devices, operating at a financial loss, Zuckerberg views this investment as crucial for the future of AI and consumer computing. He envisions AI glasses as a key tool to blend physical and digital realities, advancing the Metaverse vision. However, the article notes that

    IoT, smart-glasses, AI-wearables, augmented-reality, Meta-Reality-Labs, consumer-AI-devices, AI-interaction
  • China's cyborg battle suit gives soldiers drone-slinging superpower

    China’s Kestrel Defense has unveiled a prototype powered exoskeleton battle suit designed to enhance soldiers’ endurance, mobility, and situational awareness, particularly for drone operators, artillery units, and reconnaissance teams. The suit features mechanical leg supports to reduce fatigue during prolonged crouching or kneeling, a modular backpack housing power and control systems, and articulated shoulder arms for upper-body support. A key innovation is an integrated compact drone-launching system that enables soldiers to deploy and control small quadcopter drones in the field for short-range surveillance and reconnaissance, especially in urban environments. The soldier’s helmet includes a head-mounted augmented reality visor that can display real-time drone feeds, maps, night and thermal vision, and potentially allow interaction via gesture, eye, or voice commands. Additional digital tools such as wrist-mounted screens, health monitors, navigation aids, and encrypted communications are also integrated. Although detailed technical specifications have not been released, the suit reflects a broader global military trend toward combining robotics, wearable computing,

    robot, exoskeleton, drones, military-technology, wearable-robotics, augmented-reality, drone-control-systems
  • Meta is reportedly building AI smart glasses with Prada, too

    Meta is reportedly developing AI smart glasses in collaboration with the Italian luxury fashion brand Prada. This partnership marks a strategic move by Meta to expand its AI eyewear technology beyond its existing collaboration with EssilorLuxottica, a major eyewear conglomerate with which Meta has previously worked closely. While Prada has historically partnered with EssilorLuxottica for its eyewear production, it is not owned by the company, indicating Meta's intent to diversify its fashion partnerships. Meta has already achieved significant sales success with its Ray-Ban Meta AI smart glasses, having sold millions of units. The recent collaboration with Prada suggests Meta's ambition to integrate advanced AI features into high-end fashion eyewear, potentially broadening the appeal and market reach of its smart glasses. The article also hints at upcoming products involving other brands like Oakley, though details remain limited. Overall, Meta is positioning itself to merge cutting-edge AI technology with luxury fashion through multiple brand partnerships.

    IoT, smart-glasses, AI-technology, wearable-technology, Meta, fashion-tech, augmented-reality
  • Snap plans to sell lightweight, consumer AR glasses in 2026

    Snap has announced plans to release a new pair of lightweight, consumer-focused augmented reality (AR) smart glasses called Specs in 2026. Unlike its earlier, bulkier Spectacles launched in 2016, these new glasses will be smaller, lighter, and designed for everyday public use. Specs will feature see-through lenses that project graphics into the user’s field of view and include an AI assistant capable of processing both audio and video. The glasses will leverage Snap’s SnapOS developer ecosystem, allowing millions of existing AR experiences (Lenses) from Snapchat and previous Spectacles to be compatible with the new device. The announcement comes amid growing competition in the AR glasses market from major players like Meta and Google, both of which have recently unveiled or plan to unveil their own AR products. Snap aims to differentiate itself through its robust developer platform and AI capabilities, including new features like a Depth Module API for anchoring AR graphics in 3D space and partnerships with companies like Niantic Spatial to build AI-powered world maps. However, key details such as pricing, exact design, and sales strategy for Specs remain undisclosed. While Snap is optimistic about making AR glasses practical and appealing for consumers, the market’s response and the device’s affordability will be critical to its success.

    IoT, augmented-reality, smart-glasses, AI-assistant, wearable-technology, SnapOS, AR-applications
  • New haptic tools let humans feel and guide machines in real time

    robot, haptic-technology, industrial-automation, remote-control, digital-twins, augmented-reality, safety-in-manufacturing