RIEM News

Articles tagged with "semiconductor"

  • Meta partners up with Arm to scale AI efforts

    Meta has partnered with semiconductor design company Arm to enhance its AI systems amid a significant infrastructure expansion. The collaboration will see Meta’s ranking and recommendation systems transition to Arm’s technology, leveraging Arm’s strengths in low-power, efficient AI deployments. Meta’s head of infrastructure, Santosh Janardhan, emphasized that this partnership aims to scale AI innovation to over 3 billion users. Arm CEO Rene Haas highlighted the focus on performance-per-watt efficiency as critical for the next era of AI. This multi-year partnership coincides with Meta’s massive investments in AI infrastructure, including projects like “Prometheus,” an Ohio data center expected to deliver multiple gigawatts of power by 2027, and “Hyperion,” a 2,250-acre data center campus in Louisiana projected to provide 5 gigawatts of computational power by 2030. Unlike other recent AI infrastructure deals, Meta and Arm are not exchanging ownership stakes or physical infrastructure. This contrasts with Nvidia’s extensive investments in AI firms such

    energy, AI-infrastructure, data-centers, semiconductor, power-consumption, cloud-computing, Meta
  • 2D flash-silicon chip achieves record speed, 94% memory yield

    Researchers at Fudan University have developed the world’s first full-featured 2D flash memory chip integrated with traditional silicon CMOS technology, achieving record operating speed and a 94.3% memory cell yield. This hybrid chip supports eight-bit instruction operations and 32-bit high-speed parallel random access, surpassing existing flash memory speeds. The innovation represents a significant breakthrough in combining ultrathin 2D semiconductor materials—just a few atoms thick—with mature silicon platforms, addressing key limitations in speed and power consumption that have hindered AI and data-intensive computing systems. The team overcame major challenges in integrating fragile 2D materials onto the uneven surfaces of conventional silicon wafers by employing flexible 2D materials and a modular, atomic-scale bonding approach. This method enables stable, high-density interconnections between the two technologies, facilitating efficient communication and paving the way for industrial-scale production. The chip has completed its tape-out phase, and plans are underway to establish a pilot production line to scale manufacturing.

    materials, semiconductor, flash-memory, 2D-materials, silicon-chip, CMOS-technology, data-storage
  • AMD to supply 6GW of compute capacity to OpenAI in chip deal worth tens of billions

    AMD has entered a multi-year chip supply agreement with OpenAI that could generate tens of billions in revenue and significantly boost AMD’s presence in the AI sector. Under the deal, AMD will provide OpenAI with 6 gigawatts of compute capacity using multiple generations of its Instinct GPUs, beginning with the Instinct MI450 GPU, which is expected to be deployed in the second half of 2026. AMD claims the MI450 will outperform comparable Nvidia GPUs through hardware and software enhancements developed with OpenAI’s collaboration. OpenAI already uses AMD’s MI355X and MI300X GPUs for AI inference tasks due to their high memory capacity and bandwidth. In addition to supplying chips, AMD has granted OpenAI the option to purchase up to 160 million shares of AMD stock, representing a 10% stake. The stock vesting is tied to the deployment milestones of the compute capacity and AMD’s stock price, with the final tranche vesting if AMD shares reach $600 (a quick back-of-the-envelope on these figures follows this entry’s tags). Following the

    energy, AI-compute, GPUs, data-centers, chip-supply, semiconductor, AI-infrastructure
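
    A quick back-of-the-envelope check of the figures in the AMD–OpenAI entry above. The share count, stake percentage, and $600 trigger come from the summary; the implied share count and notional value are derived here for illustration, not reported figures.

        # Figures from the summary above; the derived totals are illustrative arithmetic only.
        warrant_shares = 160_000_000        # option on up to 160 million AMD shares
        stake_fraction = 0.10               # described as roughly a 10% stake
        final_tranche_price = 600.0         # final tranche vests if AMD stock reaches $600

        implied_shares_outstanding = warrant_shares / stake_fraction
        notional_at_final_tranche = warrant_shares * final_tranche_price

        print(f"Implied AMD shares outstanding: ~{implied_shares_outstanding / 1e9:.1f}B")
        print(f"Warrant value if fully vested at $600/share: ~${notional_at_final_tranche / 1e9:.0f}B")
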
  • A year after filing to IPO, still-private Cerebras Systems raises $1.1B

    Cerebras Systems, a Silicon Valley-based AI hardware company and competitor to Nvidia, raised $1.1 billion in a Series G funding round that values the company at $8.1 billion. This latest round, co-led by Fidelity and Atreides Management with participation from Tiger Global and others, brings Cerebras’ total funding to nearly $2 billion since its 2015 founding. The company specializes in AI chips, hardware systems, and cloud services, and has experienced rapid growth driven by its AI inference services, launched in August 2024, which run trained AI models to generate outputs. To support this growth, Cerebras opened five new data centers in 2025 across the U.S., with plans for further expansion in Montreal and Europe. Originally, Cerebras had filed for an IPO in September 2024 but faced regulatory delays due to a $335 million investment from Abu Dhabi-based G42, triggering a review by the Committee on Foreign Investment in the United States (CFIUS).

    AI-hardware, semiconductor, data-centers, cloud-computing, AI-inference, technology-funding, Silicon-Valley-startups
  • The Trump admin is going after semiconductor imports

    The Trump administration is reportedly considering a new policy aimed at boosting U.S. semiconductor production by enforcing a 1:1 manufacturing ratio. Under this approach, U.S. semiconductor companies would be required to produce domestically the same number of chips as their customers import from overseas. Companies failing to meet this ratio could face tariffs, although the timeline for achieving this target remains unclear. This move follows President Trump's discussions since August about imposing tariffs on the semiconductor industry to encourage reshoring of chip manufacturing. While the ratio-based policy could eventually increase domestic semiconductor output, it poses risks to the U.S. chip industry in the short term, as manufacturing capacity is currently insufficient to meet demand. Building new semiconductor fabrication plants is a complex and lengthy process, exemplified by Intel’s Ohio plant, which has been delayed multiple times and now aims to open in 2030. Meanwhile, Taiwan Semiconductor Manufacturing Company (TSMC) has announced plans to support U.S. chip production infrastructure, though details are sparse. Overall,

    materials, semiconductor, chip-manufacturing, tariffs, supply-chain, US-manufacturing, technology-policy
  • A timeline of the US semiconductor market in 2025

    The U.S. semiconductor market in 2025 has experienced significant developments amid geopolitical tensions and industry shifts, largely driven by the strategic importance of AI chip technology. Nvidia reported a record quarter in August, with a notable 56% year-over-year revenue growth in its data center business, underscoring its strong market position despite broader industry turmoil. Meanwhile, Intel underwent major changes: the U.S. government took an equity stake in the company, in part to shore up its foundry program, and Japanese conglomerate SoftBank also acquired a strategic stake. Intel further restructured by spinning out its telecom chip business and consolidating operations to improve efficiency, including halting projects in Germany and Poland and planning workforce reductions. Political dynamics have heavily influenced the semiconductor landscape. President Donald Trump announced potential tariffs on the industry, though none had been implemented by early September, and publicly criticized Intel CEO Lip-Bu Tan amid concerns over Tan’s ties to China. Tan met with Trump to discuss Intel’s role in revitalizing U.S

    materials, semiconductor, AI-chips, Intel, Nvidia, chip-manufacturing, technology-industry
  • Tesla Dojo: the rise and fall of Elon Musk’s AI supercomputer

    Tesla’s Dojo supercomputer, once heralded by Elon Musk as a cornerstone of the company’s AI ambitions, has been officially shut down as of August 2025. Originally designed to train Tesla’s Full Self-Driving (FSD) neural networks and support autonomous vehicle and humanoid robot development, Dojo was central to Musk’s vision of Tesla as more than just an automaker. Despite years of hype and investment, the project was abruptly ended after Tesla decided that its second-generation Dojo 2 supercluster, based on in-house D2 chips, was “an evolutionary dead end.” This decision came shortly after Tesla signed a deal to source next-generation AI6 chips from Samsung, signaling a strategic pivot away from self-reliant hardware development toward leveraging external partners for chip design. The shutdown also involved disbanding the Dojo team and the departure of key personnel, including project lead Peter Bannon and about 20 employees who left to start their own AI chip company, DensityAI

    robot, AI, autonomous-vehicles, Tesla, supercomputer, self-driving-technology, semiconductor
  • Malaysia’s SkyeChip unveils the country’s first edge AI processor

    Malaysia has introduced its first domestically developed edge AI processor, the MARS1000, created by the local chip design firm SkyeChip. The announcement was made at an industry event, marking a significant milestone in Malaysia’s growing involvement in artificial intelligence technology. This development aligns with the country's broader strategic push to enhance AI capabilities, supported by the establishment of a dedicated agency in late 2024 focused on accelerating AI adoption, creating regulatory frameworks, and addressing AI ethics. In addition to technological advancements, Malaysia is also tightening controls on AI chip exports. Following rumors that the U.S. government considered restricting AI chip exports to Malaysia and Thailand to curb smuggling to China, Malaysia’s Ministry of Investment, Trade and Industry implemented a new regulation on July 14. This rule mandates that individuals and companies notify the Malaysian government at least 30 days before exporting or transshipping U.S.-made AI chips, reflecting the country’s increasing regulatory oversight in the AI sector.

    IoT, edge-AI, AI-processor, chip-design, Malaysia-technology, semiconductor, artificial-intelligence
  • The Trump administration’s big Intel investment comes from already awarded grants

    The Trump administration announced an $8.9 billion investment in Intel, which the company described as government funds previously awarded but not yet disbursed, rather than new funding. This amount includes $5.7 billion from the Biden administration’s CHIPS Act and $3.2 billion from the Secure Enclave program. Despite President Trump’s claim that the U.S. paid nothing for these shares and his characterization of the deal as beneficial for both America and Intel, the funds are essentially government grants being converted into equity. Trump has been critical of the CHIPS Act and urged House Speaker Mike Johnson to repeal it. Intel had already received $2.2 billion in CHIPS Act funding and requested an additional $850 million reimbursement that had not yet been paid. Some legal experts question whether the CHIPS Act permits the government to convert grants into equity, suggesting potential legal challenges to the deal. Trump also previously accused Intel CEO Lip-Bu Tan of conflicts of interest, though he later praised Tan for negotiating

    materials, semiconductor, chip-manufacturing, CHIPS-Act, Intel, government-grants, technology-investment
  • U.S. government is reportedly in discussions to take stake in Intel

    The U.S. government, under the Trump administration, is reportedly in talks to acquire a stake in semiconductor company Intel. This potential investment aims to support Intel’s expansion of its manufacturing capabilities within the United States. The discussions follow concerns raised by Republican Senator Tom Cotton regarding Intel CEO Lip-Bu Tan’s alleged connections to China, which prompted scrutiny from the administration. These developments come shortly after President Trump publicly called for Tan’s resignation over perceived conflicts of interest. A meeting between Intel and government officials intended to address these concerns reportedly led to the idea of the government taking a direct ownership position in the company. Further details from Intel have not been disclosed, and the situation is still evolving.

    materials, semiconductor, manufacturing, Intel, U.S.-government, technology, chip-industry
  • Apple announces $100B American Manufacturing Program

    Apple has announced a $100 billion American Manufacturing Program (AMP), expanding its total U.S. investment to $600 billion over the next four years. The initiative aims to increase domestic production of critical components and advanced manufacturing for Apple products, while incentivizing global partners to manufacture more in the U.S. Apple plans to hire 20,000 workers primarily in research and development, silicon engineering, software development, and AI/machine learning. Initial AMP partners include Corning, Coherent, GlobalWafers America, Applied Materials, Texas Instruments, Samsung, GlobalFoundries, Amkor, and Broadcom. Key projects under AMP include a major expansion of Apple’s partnership with Corning to produce advanced smartphone glass in Harrodsburg, Kentucky, and the opening of an Apple-Corning Innovation Center there. Apple also renewed a multiyear agreement with Coherent to produce VCSEL lasers in Sherman, Texas, and committed to sourcing American-made rare earth magnets from MP Materials, which will also establish

    materials, manufacturing, semiconductor, silicon, rare-earth-magnets, advanced-glass, supply-chain
  • 2D InSe wafer outperforms silicon in mobility, switching, leakage

    Chinese scientists have achieved a major breakthrough by fabricating the world’s first wafer-scale, two-dimensional indium selenide (InSe) semiconductor chip, which outperforms silicon in key performance metrics. Using a novel “solid–liquid–solid” growth method, the team led by Professor Liu Kaihui at Peking University produced a 2-inch InSe wafer with exceptional crystal quality, phase purity, and thickness uniformity. The resulting InSe-based transistors demonstrated electron mobility up to 287 cm²/V·s, ultra-low subthreshold swings (this figure of merit is defined after this entry’s tags), minimal leakage at sub-10nm gate lengths, high on/off ratios, and energy-delay products surpassing the 2037 International Roadmap for Devices and Systems (IRDS) benchmarks. This advancement overcomes longstanding challenges in synthesizing large-area InSe due to vapor pressure differences and phase instability, by maintaining a perfect atomic ratio of indium and selenium during growth. The process is compatible with existing CMOS technology, facilitating potential real

    materials, semiconductor, indium-selenide, 2D-materials, wafer-scale-growth, transistor-technology, next-generation-chips
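
    For context on the “ultra-low subthreshold swing” claim in the entry above: subthreshold swing (SS) is the gate-voltage change needed to move the drain current by one decade, and conventional thermionic transistors cannot beat roughly 60 mV/decade at room temperature. This is the textbook definition and limit, not a figure from the article:

        SS = \frac{\mathrm{d}V_{GS}}{\mathrm{d}(\log_{10} I_D)} \;\ge\; \frac{k_B T}{q}\,\ln 10 \approx 60\ \mathrm{mV/decade} \quad (T = 300\ \mathrm{K})
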
  • Female-founded semiconductor AI startup SixSense raises $8.5M

    SixSense, a Singapore-based deep tech startup founded in 2018 by engineers Akanksha Jagwani (CTO) and Avni Agarwal (CEO), has developed an AI-powered platform that enables semiconductor manufacturers to predict and detect chip defects in real time on production lines. The startup recently raised $8.5 million in a Series A funding round led by Peak XV’s Surge, bringing its total funding to approximately $12 million. SixSense addresses a critical challenge in semiconductor manufacturing by converting vast amounts of raw production data—such as defect images and equipment signals—into actionable insights that help factories prevent quality issues and improve yield. The platform is designed for process engineers rather than data scientists, allowing them to fine-tune models and deploy solutions quickly without coding. Despite the semiconductor industry's reputation for precision, inspection processes remain largely manual and fragmented, with existing systems primarily displaying data without deep analysis. SixSense’s AI platform offers early warnings, root cause analysis, and failure prediction, enabling manufacturers to act

    semiconductor, AI, manufacturing, defect-detection, automation, quality-control, deep-tech
  • Kleiner Perkins-backed Ambiq pops on IPO debut

    Ambiq Micro, a 15-year-old company specializing in energy-efficient chips for wearable and medical devices, made a strong debut on the public market with its IPO on July 30, 2025. The stock closed at $38.53 per share, a 61% increase from its initial $24 IPO price, valuing the company at approximately $656 million, up significantly from its $450 million private valuation in 2023. Ambiq positions itself to benefit from AI-driven growth by offering low-energy edge processors capable of integrating more intelligence and AI functionalities, as highlighted by CTO Scott Hanson. Financially, Ambiq reported a net loss of $8.3 million on $15.7 million in revenue for Q1 2025, showing a slight improvement compared to the same period in 2024. The company’s largest outside investors include Kleiner Perkins and Singapore’s state-backed EDB Investments. Notably, Kleiner Perkins partner Wen Hsieh has been a long-term

    energy, energy-efficient-chips, AI-edge-processors, wearable-devices, medical-devices, semiconductor, IPO-technology
  • Tesla confirms $16.5 billion Samsung deal for next-gen chip supply

    Samsung Electronics has secured a $16.5 billion semiconductor supply deal with Tesla to produce next-generation AI chips, confirmed by both Samsung’s regulatory filing and Elon Musk’s social media announcement. The contract, effective from July 26, 2024, through December 31, 2033, involves Samsung’s new Texas semiconductor fabrication plant dedicated to manufacturing Tesla’s AI6 chips. Musk highlighted the strategic importance of this partnership, noting that Samsung currently produces AI4 chips while TSMC handles AI5 chips, with Tesla collaborating closely with Samsung to optimize manufacturing efficiency. Although Samsung has kept full contract details confidential to protect trade secrets, the deal’s scale and duration underscore its significance. This agreement represents a major boost for Samsung’s foundry business, which has been striving to catch up with competitors like TSMC in the rapidly growing AI chip market. Samsung is advancing its semiconductor technology, including plans for mass production of 2-nanometer chips that offer improved speed and energy efficiency—technology expected to

    energy, materials, semiconductor, AI-chips, Tesla, Samsung, manufacturing
  • Semiconductor, EV autonomy testing becomes more efficient with Nigel AI

    Emerson has developed Nigel AI Advisor, an AI-powered tool designed to enhance the efficiency and effectiveness of engineering innovations, particularly in complex test and measurement applications across industries such as semiconductors, transportation, and electronics. Integrated into Emerson’s flagship NI LabVIEW and NI TestStand software, Nigel leverages advanced large language models trained specifically on NI software to provide engineers with contextual advice, automation assistance, and detailed recommendations for improving code and test execution. The tool allows users to interact via natural language prompts, delivering precise engineering-format responses like tables and graphs, thereby enabling faster and more informed decision-making while safeguarding user data on a secure cloud platform. Nigel AI Advisor is tailored to test application development, distinguishing it from general-purpose AI assistants by being built on decades of trusted test knowledge and data. It can answer questions about programming and automation concepts, help users develop complex automated sequences, and even modify and execute test runs through interaction with the TestStand API. First unveiled at the NI Connect conference, Nigel represents

    robot, automation, AI, semiconductor, testing, engineering, software
  • Smuggled NVIDIA chips flood China despite US export crackdown

    A Financial Times investigation reveals that despite the U.S. government's export controls introduced in April 2025 banning NVIDIA’s China-specific H20 AI chips, over $1 billion worth of smuggled NVIDIA B200 and other restricted chips have flooded the Chinese market. These chips are openly sold on Chinese social media platforms like Douyin and Xiaohongshu, often alongside other high-end NVIDIA products, and are purchased by local data center suppliers serving major AI firms. The black market emerged rapidly after the export ban, with sellers even promising access to next-generation B300 chips ahead of official launches. NVIDIA maintains that it does not sell restricted chips to Chinese customers and does not support unauthorized deployments, emphasizing that datacenters require official service and support. CEO Jensen Huang has downplayed the extent of chip diversion and criticized export controls as ineffective, arguing they may accelerate China’s independent AI hardware development, potentially undermining U.S. leadership. The U.S. government is pressuring allies like Singapore, where arrests

    semiconductor, AI-chips, NVIDIA, export-controls, black-market, data-centers, chip-smuggling
  • China's bug-inspired tech to detect missiles 20,000x faster than US

    Chinese scientists have developed a novel infrared sensor inspired by the fire beetle’s heat-sensing organ, which can detect missiles and heat sources up to 20,000 times faster than existing technologies. Created by researchers at the Shanghai Institute of Technical Physics and Tongji University, the sensor uses materials like palladium diselenide and pentacene to operate in the mid-infrared range, enabling it to detect extremely low heat levels even in challenging environments such as smoke, dust, or fog. Tested in simulated wildfire conditions, the sensor demonstrated nearly 95% accuracy in tracking flame movement and storing heat patterns, highlighting its potential for applications in night vision, fire detection, industrial safety, and defense surveillance. In addition, a related device using black phosphorus and indium selenide achieved photonic memory speeds of 0.5 microseconds, allowing precise real-time data capture and image recognition. This advancement could enhance military systems, including missile defense units like China’s HQ-17AE, by enabling

    materials, infrared-sensor, missile-detection, biomimicry, surveillance-technology, semiconductor, defense-technology
  • Germany creates first-ever hybrid alloy for next-gen quantum chips

    Researchers in Germany have developed a groundbreaking hybrid semiconductor alloy composed of carbon, silicon, germanium, and tin (CSiGeSn), marking the first stable material of its kind. Created by teams at Forschungszentrum Jülich and the Leibniz Institute for Innovative Microelectronics, this new compound belongs to Group IV of the periodic table, ensuring full compatibility with existing CMOS chip manufacturing processes. The addition of carbon to the silicon-germanium-tin matrix enables unprecedented control over the band gap, a key factor influencing electronic and photonic properties, potentially allowing innovations such as room-temperature lasers and efficient thermoelectric devices. This advancement overcomes previous challenges in combining these four elements due to differences in atomic size and bonding behavior, achieved through an advanced chemical vapor deposition (CVD) technique. The resulting material maintains the delicate crystal lattice structure essential for chip fabrication and is visually indistinguishable from conventional wafers. The team successfully demonstrated the first light-emitting diode (LED) based on a quantum well

    materials, semiconductor, quantum-computing, alloy, silicon, photonics, microelectronics
  • Nvidia is set to resume China chip sales after months of regulatory whiplash

    Nvidia has announced it is filing applications to resume sales of its H20 artificial intelligence chips to China after several months of regulatory uncertainty. The H20 chip, designed for AI inference tasks rather than training new models, is currently the most powerful AI processor Nvidia can legally export to China under U.S. export controls. Alongside the H20, Nvidia is introducing a new “RTX Pro” chip tailored specifically for the Chinese market, which the company says complies fully with regulations and is suited for digital manufacturing applications like smart factories and logistics. The regulatory back-and-forth began in April when the Trump administration imposed restrictions on sales of high-performance chips, including the H20, potentially costing Nvidia $15 to $16 billion in revenue from Chinese customers. However, after Nvidia CEO Jensen Huang attended a high-profile dinner at Mar-a-Lago and pledged increased U.S. investments and jobs, the administration paused the ban. This episode highlights the ongoing tension between U.S. national security concerns aimed at limiting China’s

    materials, semiconductor, AI-chips, Nvidia, China-tech-market, export-controls, digital-manufacturing
  • A clever glass trick fixes the decade-old photonic crystal laser problem

    Engineers at the University of Illinois Urbana-Champaign (UIUC) have solved a decade-old challenge in photonic-crystal surface-emitting lasers (PCSELs) by replacing the traditionally used fragile air holes in the photonic crystal layer with embedded silicon dioxide, a solid dielectric material. This innovation prevents the collapse of the photonic crystal structure during semiconductor regrowth, a problem that previously hindered PCSEL development. Despite silicon dioxide being amorphous and difficult for semiconductor growth, the team successfully grew semiconductor layers laterally around the dielectric and merged them via coalescence, enabling the first demonstration of a room-temperature, eye-safe, photopumped PCSEL. This breakthrough creates a more stable, precise, and scalable PCSEL technology capable of producing high-brightness, narrow, circular laser beams suitable for applications such as LiDAR, optical communication, autonomous vehicle sensors, and defense systems. The use of solid dielectric material also simplifies fabrication and enhances device durability. However, the current design requires

    materials, photonic-crystal, silicon-dioxide, laser-technology, semiconductor, PCSEL, optical-communication
  • Nvidia reportedly plans to release new AI chip designed for China

    Nvidia is reportedly planning to release a new AI chip tailored specifically for the Chinese market, aiming to navigate around U.S. export restrictions on advanced semiconductor technology. The chip, expected as early as September, will be based on Nvidia’s Blackwell RTX Pro 6000 processor but modified to comply with current regulations. Notably, these China-specific chips will exclude high-bandwidth memory and NVLink, Nvidia’s proprietary high-speed communication interface, which are key features in its more advanced AI chips. This move reflects Nvidia’s determination to maintain its presence and sales in China despite tightening export controls. Nvidia CEO Jensen Huang recently indicated a potential impact on the company’s revenue and profit forecasts due to these restrictions, though this new product launch might mitigate some of those effects. Additional details from Nvidia were not provided at the time of reporting.

    materials, AI-chip, semiconductor, Nvidia, technology, processor, hardware
  • European quantum scientists flip excitons like light switches

    Researchers from the University of Innsbruck, in collaboration with universities in Dortmund, Bayreuth, and Linz, have developed a novel technique to control dark excitons in semiconductor quantum dots using chirped laser pulses and magnetic fields. Excitons are quasiparticles formed when an electron is excited to a higher energy state, leaving behind a positively charged hole; the electron and hole pair orbit each other due to Coulomb attraction. Excitons are categorized as bright or dark based on their interaction with light: bright excitons can absorb or emit photons, while dark excitons, owing to their spin configuration, cannot couple directly to light and thus have longer lifetimes, making them promising for energy storage and quantum information applications. The team demonstrated the ability to switch bright excitons into dark excitons and vice versa, effectively using dark excitons as a quantum memory by storing quantum states in a non-radiative form and reactivating them later with laser pulses. This controlled manipulation opens new avenues

    materials, quantum-dots, excitons, semiconductor, energy-storage, optoelectronics, quantum-entanglement
  • Nvidia becomes first $4 trillion company as AI demand explodes

    Nvidia has become the first publicly traded company to reach a $4 trillion market capitalization, driven by soaring demand for its AI chips. The semiconductor giant's stock surged to a record $164 per share, marking a rapid valuation increase from $1 trillion in June 2023 to $4 trillion in just over two years—faster than tech giants Apple and Microsoft, which have also surpassed $3 trillion valuations. Nvidia now holds the largest weight in the S&P 500 at 7.3%, surpassing Apple and Microsoft, and its market value exceeds the combined stock markets of Canada and Mexico as well as all publicly listed UK companies. This historic rise is fueled by the global tech industry's race to develop advanced AI models, all heavily reliant on Nvidia’s high-performance chips. Major players like Microsoft, Meta, Google, Amazon, and OpenAI depend on Nvidia hardware for AI training and inference tasks. The launch of Nvidia’s next-generation Blackwell chips, designed for massive AI workloads, has intensified

    robot, AI-chips, autonomous-systems, Nvidia, semiconductor, data-centers, artificial-intelligence
  • US chipmakers could see bigger tax credits if Trump’s spending bill passes

    The Trump administration’s current spending bill, known as the “Big, Beautiful Bill,” includes a provision that could significantly increase tax credits for semiconductor manufacturers building plants in the U.S. The bill, which has already passed the Senate, proposes raising the tax credit from 25% to 35% (a simple illustration of the difference follows this entry’s tags). This enhanced credit aims to incentivize companies like Intel, TSMC, and Micron Technology to expand their domestic manufacturing capabilities. This potential tax boost comes at a critical time for the semiconductor industry, which has faced challenges due to recent export restrictions on advanced AI chips to China. The increased tax credit could help offset some of the difficulties caused by these trade limitations and support the growth of U.S.-based chip production. However, the final impact depends on whether the spending bill passes in its current form.

    materials, semiconductor, chip-manufacturing, tax-credits, US-manufacturing, technology-industry, Intel
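
    A simple illustration of the proposed credit increase in the entry above. The $20 billion plant cost is a hypothetical figure chosen for illustration, not a number from the article; only the 25% and 35% rates come from the summary.

        # Hypothetical fab cost for illustration; only the 25% and 35% rates come from the bill summary.
        fab_capex = 20e9                     # assumed plant investment (not from the article)
        credit_current = 0.25 * fab_capex    # credit under the existing 25% rate
        credit_proposed = 0.35 * fab_capex   # credit under the proposed 35% rate

        print(f"Credit at 25%: ${credit_current / 1e9:.1f}B")
        print(f"Credit at 35%: ${credit_proposed / 1e9:.1f}B")
        print(f"Additional credit: ${(credit_proposed - credit_current) / 1e9:.1f}B")
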
  • World’s first semiconductor made by quantum tech stuns chip industry

    Researchers at Australia’s Commonwealth Scientific and Industrial Research Organisation (CSIRO) have unveiled the world’s first semiconductor fabricated using quantum machine learning (QML) techniques, marking a significant breakthrough in semiconductor design. Their approach, centered on a Quantum Kernel-Aligned Regressor (QKAR), outperformed seven classical machine learning (CML) algorithms traditionally used in this field. The team focused on modeling the Ohmic contact resistance—a critical yet challenging parameter that measures electrical resistance at the metal-semiconductor interface—using data from 159 experimental samples of gallium nitride high electron mobility transistors (GaN HEMTs), which offer superior performance compared to silicon-based semiconductors. The QKAR architecture converts classical data into quantum data using five qubits, enabling efficient feature extraction through a quantum kernel alignment layer. This quantum-processed information is then analyzed by classical algorithms to identify key fabrication parameters and optimize the semiconductor manufacturing process (a simplified sketch of this kernel-plus-regressor pattern follows this entry’s tags). By intelligently reducing the problem’s dimensionality, the researchers ensured compatibility

    semiconductor, quantum-technology, quantum-machine-learning, materials-science, chip-design, gallium-nitride, high-electron-mobility-transistor
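
    A minimal sketch of the kernel-plus-classical-regressor pattern described in the CSIRO entry above. This is not the QKAR implementation: the quantum kernel-alignment layer is replaced by an ordinary RBF kernel, the data are synthetic, and all names are placeholders. It only shows how a kernel feeds a classical ridge regressor that models a quantity such as contact resistance from process parameters.

        import numpy as np

        # Synthetic stand-in data: 159 samples with 5 process features, mirroring the
        # sample count mentioned above; the target plays the role of contact resistance.
        rng = np.random.default_rng(0)
        X = rng.uniform(size=(159, 5))
        y = X @ rng.normal(size=5) + 0.05 * rng.normal(size=159)

        def rbf_kernel(A, B, gamma=1.0):
            """Classical RBF kernel used here as a placeholder for a quantum kernel."""
            d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
            return np.exp(-gamma * d2)

        # Kernel ridge regression: solve for dual coefficients, then predict a new recipe.
        lam = 1e-3
        K = rbf_kernel(X, X)
        alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)

        x_new = rng.uniform(size=(1, 5))
        prediction = rbf_kernel(x_new, X) @ alpha
        print(prediction)
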
  • Reversible computing can help reclaim your chip's wasted energy

    The article discusses the significant energy inefficiency in modern AI hardware, where nearly all electrical energy consumed by processors is lost as heat due to fundamental limitations in conventional CMOS transistor technology. This inefficiency is especially critical as generative AI models like ChatGPT demand substantially more power per query compared to traditional searches, contributing to data centers potentially consuming up to 12% of US electricity by 2030. The root cause lies in abrupt transistor switching in CMOS chips, which dissipates energy as heat and imposes costly cooling requirements and scalability challenges. Vaire Computing, a startup based in the US and UK, proposes a solution through reversible computing using adiabatic switching. This approach gradually transfers electrical charge during transistor switching, significantly reducing energy loss by preserving and recycling information rather than erasing it, thereby circumventing Landauer’s principle that links information deletion to heat generation (the bound is quoted after this entry’s tags). Vaire’s prototypes currently reclaim about 50% of wasted computational energy, with expectations for even greater efficiency improvements. This innovation could mark a

    energy, semiconductor, reversible-computing, chip-efficiency, AI-hardware, adiabatic-switching, data-centers
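
    For reference, the Landauer bound mentioned in the entry above sets the minimum heat released when one bit of information is erased; reversible (adiabatic) logic sidesteps this floor by not erasing information. The figure below is the textbook value at room temperature, not a number from the article:

        E_{\min} = k_B T \ln 2 \approx (1.38\times10^{-23}\ \mathrm{J/K})(300\ \mathrm{K})(0.693) \approx 2.9\times10^{-21}\ \mathrm{J\ per\ bit}
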
  • Intel hits the brakes on its automotive business, and layoffs have started

    Intel is shutting down its automotive architecture business and laying off most of its staff as part of a broader company restructuring aimed at refocusing on its core client and data center segments. The decision was communicated internally on June 25, 2025, with Intel emphasizing a commitment to a smooth transition for customers. While the automotive division was not a major revenue driver, it had been active in automated vehicle technology and software-defined vehicles, investing heavily since around 2015, including the $15.3 billion acquisition of Mobileye in 2017, which later became a publicly traded company with Intel as a major shareholder. Despite showcasing new AI-enhanced system-on-chip (SoC) technology for vehicles at CES 2025 and the Shanghai Auto Show earlier this year, the automotive business’s future appeared uncertain amid broader company challenges. New CEO Lip-Bu Tan had already warned of layoffs due to falling sales and a bleak outlook. The wind-down follows Intel’s recent announcement of layoffs in its Foundry division

    robot, autonomous-vehicles, automotive-technology, AI, semiconductor, software-defined-vehicles, Intel
  • Perovskite image sensor triples light capture, sharpens resolution

    Researchers at ETH Zurich and Empa in Switzerland have developed a novel perovskite-based image sensor that significantly outperforms traditional silicon sensors in light sensitivity, resolution, and color accuracy. Unlike conventional sensors that rely on color filters—resulting in substantial light loss by capturing only about one-third of incoming photons per pixel—the new sensor uses stacked layers of lead halide perovskite crystals. Each layer is chemically tuned to absorb a specific wavelength (red, green, or blue) without filters, enabling each pixel to capture the full spectrum of light. This design allows the sensor to capture up to three times more light and achieve three times greater spatial resolution than current silicon-based sensors (the idealized arithmetic is sketched after this entry’s tags). The perovskite sensor’s tunability comes from adjusting the chemical composition of the crystals, specifically the ratios of iodine, bromine, and chlorine ions, to target different colors. This approach not only enhances image clarity and color precision but also reduces digital artifacts. The researchers have successfully miniaturized the technology

    materials, perovskite, image-sensor, light-capture, semiconductor, machine-vision, digital-photography
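
    An idealized version of the light-capture arithmetic implied by the entry above (illustrative only; real filter and absorber efficiencies are more complicated than this):

        # Idealized comparison: a color-filter (Bayer) pixel passes roughly one of three
        # spectral bands, while a stacked absorber pixel captures all three.
        bands = 3
        bayer_utilization = 1 / bands    # about one-third of incoming photons per pixel
        stacked_utilization = 1.0        # red, green, and blue layers all absorb in one pixel

        print(f"Bayer pixel photon utilization:   {bayer_utilization:.2f}")
        print(f"Stacked pixel photon utilization: {stacked_utilization:.2f}")
        print(f"Improvement factor:               {stacked_utilization / bayer_utilization:.1f}x")
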
  • US labs build low-cost gallium nitride chips for next-gen radars

    Researchers at MIT and partner institutions have developed a novel, low-cost fabrication process that integrates high-performance gallium nitride (GaN) transistors onto standard silicon CMOS chips. This breakthrough addresses previous challenges related to GaN’s high cost and specialized integration needs by using a scalable method compatible with existing semiconductor manufacturing. The process involves creating many tiny GaN transistor "dielets," which are bonded onto silicon chips using a low-temperature copper-to-copper bonding technique. This approach maintains material functionality, reduces system temperature, and significantly enhances performance while keeping costs low. The team demonstrated the effectiveness of this hybrid chip technology by building a power amplifier that outperformed traditional silicon-based devices in signal strength and efficiency, indicating potential improvements in wireless communication such as better call quality, increased bandwidth, and longer battery life. The integration method avoids expensive materials and high temperatures, making it compatible with standard semiconductor foundries and promising broad applicability in commercial electronics. Additionally, the researchers suggest that this technology could support quantum computing applications due to

    materials, gallium-nitride, semiconductor, CMOS, chip-fabrication, power-electronics, radar-systems
  • China’s AI system builds Intel-class chips with zero US software

    China has developed an AI-powered chip design system called QiMeng, created by the Chinese Academy of Sciences and affiliated institutions, to accelerate semiconductor development and reduce reliance on Western software amid escalating US-China tech tensions. QiMeng uses large language models to automate complex chip design tasks, significantly shortening development times—for example, producing an autonomous-driving chip in days instead of weeks. The platform is structured in three layers, integrating processor models, design agents, and chip design applications to support automated front-end design, hardware description language generation, OS configuration, and compiler toolchain creation. Researchers have already built two processors with QiMeng: QiMeng-CPU-v1, comparable to Intel’s 486, and QiMeng-CPU-v2, similar to Arm’s Cortex A53. The launch of QiMeng directly responds to US export restrictions that limit Chinese access to leading electronic design automation (EDA) software from companies like Synopsys, Cadence, and Siemens EDA, which previously dominated China’s EDA market. By open-sourcing QiMeng and publishing detailed documentation, China aims to improve design efficiency, reduce costs, and enable rapid customization of chip architectures and software stacks. While China still faces challenges in fabrication technology and ecosystem diversity, QiMeng represents a strategic step toward automating the full chip design and verification process and advancing China’s broader goal of semiconductor self-reliance in the face of ongoing geopolitical pressures.

    AI, semiconductor, chip-design, processor, automation, technology-independence, Chinese-Academy-of-Sciences
  • AMD acqui-hires the employees behind Untether AI

    energy, AI, semiconductor, acquisition, efficiency, robotics, technology
  • Breakthrough: Scientists spot hidden quantum states after 60-year hunt

    materials, quantum-states, superconductors, semiconductor, energy-scales, nanowires, vortex-states
  • New silicone glows in vibrant colors while conducting electricity

    materials, semiconductor, electrical-conductivity, flexible-electronics, silicone, copolymer, innovative-materials
  • World's smallest atomic-scale semiconductor produces solar hydrogen

    semiconductor, solar-hydrogen, photocatalyst, quantum-materials, energy-solutions, nanotechnology, sustainable-energy
  • New DirectDrive plasma etching tech to help build ultra-precise chips

    materials, semiconductor, plasma-etching, chip-manufacturing, precision-technology, electronics, RF-energy
  • The Future of Manufacturing Might Be in Space

    materials, manufacturing, space-technology, crystal-growth, semiconductor, in-space-manufacturing, aerospace
  • New molecule could revolutionize chip manufacturing

    materials, semiconductor, organic-molecules, electrical-conductivity, chip-production, nanotechnology, energy-efficiency
  • Huawei aims to take on Nvidia’s H100 with new AI chip

    Huawei, AI-chip, Nvidia, Ascend-910D, semiconductor, technology, China