RIEM News

Articles tagged with "power-consumption"

  • India’s TCS gets TPG to fund half of $2B AI data center project

    Tata Consultancy Services (TCS) has partnered with private equity firm TPG to secure $1 billion in funding for the first half of a $2 billion multi-year project called “HyperVault,” aimed at building a network of gigawatt-scale, liquid-cooled, high-density AI data centers across India. This initiative addresses the country’s significant gap between its large data generation (nearly 20% of global data) and its limited data center capacity, which currently accounts for only about 3% of the global total. The new data centers will support advanced AI workloads and are designed to meet the growing demand for AI compute power amid rapid adoption of AI technologies in India. However, the project faces challenges related to resource constraints, including water scarcity, power supply, and land availability, especially in urban hubs like Mumbai, Bengaluru, and Chennai where data center concentration is high. Liquid cooling, while necessary for managing the heat from power-intensive AI GPUs, raises concerns about water usage.

    energy, data-centers, AI-infrastructure, liquid-cooling, power-consumption, water-scarcity, cloud-computing
  • World’s largest 62-mile ‘God particle’ collider plan shelved in China

    China’s plan to build the Circular Electron Positron Collider (CEPC), a proposed 62-mile particle collider designed to study the Higgs boson with unprecedented precision, has been effectively stalled after it was excluded from the country’s upcoming five-year plan. Despite completing its full technical design reports by October 2025 and receiving positive international reviews, the multibillion-dollar project led by the Institute of High Energy Physics (IHEP) in Beijing will not proceed immediately. The CEPC project, estimated to cost around US$5.1 billion and involving thousands of scientists globally, is now on hold as China explores other large science initiatives for 2026-2030. The CEPC team plans to resubmit the proposal in 2030 but may abandon the domestic project if Europe’s competing Future Circular Collider (FCC) gains approval first. The FCC has a slightly smaller 56-mile ring but a significantly larger budget of US$18.4 billion.

    energy, particle-collider, high-precision-detector, energy-resolution, power-consumption, scientific-research, physics-innovation
  • Meta partners up with Arm to scale AI efforts

    Meta has partnered with semiconductor design company Arm to enhance its AI systems amid a significant infrastructure expansion. The collaboration will see Meta’s ranking and recommendation systems transition to Arm’s technology, leveraging Arm’s strengths in low-power, efficient AI deployments. Meta’s head of infrastructure, Santosh Janardhan, emphasized that this partnership aims to scale AI innovation to over 3 billion users. Arm CEO Rene Haas highlighted the focus on performance-per-watt efficiency as critical for the next era of AI. This multi-year partnership coincides with Meta’s massive investments in AI infrastructure, including projects like “Prometheus,” a data center in Ohio expected to deliver multiple gigawatts of power by 2027, and “Hyperion,” a 2,250-acre data center campus in Louisiana projected to provide 5 gigawatts of computational power by 2030. Unlike other recent AI infrastructure deals, Meta and Arm are not exchanging ownership stakes or physical infrastructure, in contrast with Nvidia’s extensive investments in AI firms.

    energy, AI-infrastructure, data-centers, semiconductor, power-consumption, cloud-computing, Meta
  • What’s behind the massive AI data center headlines?

    The article discusses the recent surge in massive AI data center investments in Silicon Valley, driven primarily by the needs of OpenAI and its partners. Nvidia announced significant infrastructure commitments, while OpenAI revealed plans to expand capacity through collaborations with Oracle and SoftBank, adding gigawatts of new power to support future versions of ChatGPT. These individual deals are enormous, but collectively they highlight Silicon Valley’s intense efforts to provide OpenAI with the computational resources required to train and operate increasingly powerful AI models. OpenAI also introduced a new AI feature called Pulse, which operates independently of the ChatGPT app and is currently available only to its $200-per-month Pro subscribers due to limited server capacity. The company aims to expand such features to a broader user base but is constrained by the availability of AI data centers. The article raises the question of whether the hundreds of billions of dollars being invested in AI infrastructure to support OpenAI’s ambitions are justified by the value of features like Pulse.

    energy, data-centers, AI-infrastructure, power-consumption, cloud-computing, server-capacity, Silicon-Valley-investments
  • Why the Oracle-OpenAI deal caught Wall Street by surprise

    The recent deal between OpenAI and Oracle caught Wall Street off guard but underscores Oracle’s continuing significance in AI infrastructure despite its legacy status. OpenAI’s willingness to commit substantial funds, reportedly around $60 billion annually for compute and custom AI chip development, signals its aggressive scaling strategy and desire to diversify infrastructure providers to mitigate risk. Industry experts highlight that OpenAI is assembling a comprehensive global AI supercomputing foundation, which could give it a competitive edge. Oracle’s involvement, while unexpected to some given its perceived diminished role compared to cloud giants like Google, Microsoft, and AWS, is explained by its proven capabilities in delivering large-scale, high-performance infrastructure, including supporting TikTok’s U.S. operations. However, key details about the deal remain unclear, particularly regarding how OpenAI will finance and power its massive compute needs. The company is burning through billions annually despite growing revenues from ChatGPT and other products, raising questions about sustainability. Energy sourcing is a critical concern given the projected growth in data center power demand.

    energy, AI-infrastructure, cloud-computing, supercomputing, data-centers, power-consumption, OpenAI
  • Building green lasers that last: A story of patents and persistence

    The article "Building green lasers that last: A story of patents and persistence" explores the complex engineering challenges behind developing reliable green laser distance meters, despite their clear advantages over traditional red lasers. Green lasers offer significantly better visibility in bright outdoor conditions, making them highly desirable for construction, surveying, and industrial applications. However, the transition from red to green lasers is far from straightforward due to increased power consumption, heat generation, and the lower sensitivity of photodetectors to green light. These factors result in shorter battery life, thermal instability, reduced measurement range, and accuracy issues, especially under harsh outdoor lighting. Beyond the physical and optical challenges, manufacturing green laser modules at scale presents additional hurdles. Green laser components are more difficult and costly to produce consistently, with small variances causing significant performance differences between units. The article emphasizes that engineering a green laser distance meter involves balancing conflicting demands: boosting power to improve range and accuracy increases heat and safety risks, while reducing power compromises performance.

    materials, energy-efficiency, laser-technology, green-lasers, power-consumption, heat-management, optical-engineering
  • ChatGPT: Everything you need to know about the AI-powered chatbot

    ChatGPT, OpenAI’s AI-powered text-generating chatbot, has grown rapidly since its launch to reach 300 million weekly active users. In 2024, OpenAI made significant strides with new generative AI offerings and highly anticipated platform launches, despite facing internal executive departures and legal challenges related to copyright infringement and its shift toward a for-profit model. As of 2025, OpenAI is contending with perceptions of losing ground in the AI race, while working to strengthen ties with Washington and secure one of the largest funding rounds in history. Recent updates in 2025 include OpenAI’s strategic use of Google’s AI chips alongside Nvidia GPUs to power its products, marking a diversification in hardware. A new MIT study raised concerns that ChatGPT usage may impair critical thinking, showing reduced brain engagement compared to traditional writing methods. The ChatGPT iOS app saw 29.6 million downloads in the past month, highlighting its massive popularity. OpenAI also launched o3.

    energy, artificial-intelligence, OpenAI, GPUs, AI-chips, power-consumption, machine-learning
  • US accelerator slashes power use by 80%, boosts beam brightness by 100x

    energy, materials, accelerator-technology, beam-brightness, power-consumption, permanent-magnets, research-innovation
  • Japan's new magnetic memory cuts power usage by 35% at record speed

    energy, memory-technology, magnetic-memory, power-consumption, SOT-MRAM, energy-efficiency, integrated-circuits
  • OpenAI’s planned data center in Abu Dhabi would be bigger than Monaco

    energy, data-center, AI-infrastructure, power-consumption, Abu-Dhabi, OpenAI, G42
  • Elon Musk’s 200,000-GPU supercomputer

    energy, GPU, supercomputer, AI, Tesla, power-consumption, environmental-impact