Olay Gazete Turkish Newspaper in London

OpenAI and Microsoft join UK’s international coalition to safeguard AI development

By Melis Yahsi | News in English | 22 February 2026

OpenAI and Microsoft pledge funding to the AI Security Institute’s Alignment Project: an international effort to ensure advanced AI systems are safe, secure and under control.

  • OpenAI and Microsoft pledge new funding to AI Security Institute’s flagship Alignment Project: an international effort to work towards advanced AI systems that are safe, secure and under control
  • AI Alignment – making sure AI acts as intended – is a crucial field of AI research, building public trust in the technologies already reshaping public services and delivering new jobs
  • Additional £5.6 million OpenAI backing plus support from Microsoft and others confirmed at the AI Impact Summit means over £27 million is now available for AI alignment research, backing some 60 projects

Leading tech firms OpenAI and Microsoft are the latest to join an initiative spearheaded by the UK’s AI Security Institute (AISI) – encouraging trust and public confidence in AI as it rewires public services and drives national renewal.

Announced by Deputy Prime Minister David Lammy and AI Minister Kanishka Narayan as the AI Impact Summit in India draws to a close today (Friday 20 February), the news bolsters the work of AISI’s Alignment Project, which was first announced last summer.

Some £27 million will now be made available through the fund, supporting research efforts to ensure AI systems work as they’re supposed to, with £5.6 million coming from OpenAI, and additional support from Microsoft and others.

Cementing the UK’s position as a world leader in frontier AI research, today also sees the first Alignment Project grants awarded to 60 projects from across 8 countries, with a second round due to open this summer.

AI alignment refers to the effort of steering advanced AI systems to reliably act as we intend them to, without unintentional or harmful behaviours. It involves developing methods that prevent such unsafe behaviours as AI systems become more capable. Progress on alignment is something that will boost confidence and trust in AI, ultimately supporting the adoption of systems which are increasing productivity, slashing medical scan times for patients, and unlocking new jobs for communities up and down the country.

Without continued progress in alignment research, increasingly powerful AI models could act in ways that are difficult to anticipate or control – which could pose challenges for global safety and governance.

UK Deputy Prime Minister, David Lammy, said:

AI offers us huge opportunities, but we will always be clear-eyed on the need to ensure safety is baked into it from the outset.

We’ve built strong safety foundations which have put us in a position where we can start to realise the benefits of this technology. The support of OpenAI and Microsoft will be invaluable in continuing to progress this effort.

UK AI Minister, Kanishka Narayan, said:

We can only unlock the full power of AI if people trust it – that’s the mission driving all of us. Trust is one of the biggest barriers to AI adoption, and alignment research tackles this head-on.

With fresh backing from OpenAI and Microsoft, we’re supporting work that’s crucial to ensuring AI delivers its huge benefits safely, confidently and for everyone.

Alignment is crucial for the security of advanced AI systems and its long-term adoption across all walks of life. It is about making sure AI models operate as they should do, even as their capabilities rapidly evolve. With the rise of AI systems that can perform increasingly complex tasks, there is a growing global consensus that AI alignment is one of the most urgent technical challenges of our era.

Besides OpenAI and Microsoft, AISI’s Alignment Project is supported by an international coalition including:

  • Canadian Institute for Advanced Research (CIFAR)
  • Australian Department of Industry, Science and Resources’ AI Safety Institute
  • Schmidt Sciences
  • Amazon Web Services (AWS)
  • Anthropic
  • AI Safety Tactical Opportunities Fund
  • Halcyon Futures
  • Safe AI Fund
  • Sympatico Ventures
  • Renaissance Philanthropy
  • UK Research and Innovation (UKRI)
  • Advanced Research and Invention Agency (ARIA)

It is led by a world-class expert advisory board, including Yoshua Bengio, Zico Kolter, Shafi Goldwasser, and Andrea Lincoln.

Mia Glaese, VP of Research at OpenAI, said:

As AI systems become more capable and more autonomous, alignment has to keep pace. The hardest problems won’t be solved by any one organisation working in isolation—we need independent teams testing different assumptions and approaches. Our support for the UK AI Security Institute’s Alignment Project complements our internal alignment work and helps strengthen a broader research ecosystem focused on keeping advanced systems reliable and controllable as they’re deployed in more open-ended settings.

As home to world-leading AI companies and research institutions, and 4 of the world’s top 10 universities, the UK is uniquely positioned to lead global efforts to build AI that we can have confidence in.

The Alignment Project builds on AISI’s international leadership, ensuring leading researchers from the UK and collaborating partners can shape the direction of the field and drive progress on safe AI that behaves predictably.

The Project combines grant funding for research, access to compute infrastructure, and ongoing academic mentorship from AISI’s own leading scientists in the field to drive progress in alignment research.

Notes

Visit the Alignment Project website for further information.

The Alignment Project advisory board includes:

  • Yoshua Bengio, Full Professor at Université de Montréal and founder and scientific advisor of Mila – Quebec AI Institute
  • Zico Kolter, Professor and Head of Machine Learning Department at Carnegie Mellon University
  • Shafi Goldwasser, Research Director for Resilience, Simons Institute, UC Berkeley
  • Andrea Lincoln, Assistant Professor of Computer Science, Boston University
  • Buck Shlegeris, Chief Executive Officer, Redwood Research
  • Sydney Levine, Research Scientist, Google DeepMind
  • Marcelo Mattar, Assistant Professor of Psychology and Neural Science at New York University

 

Source: GOV.UK

Address: 100 Green Lanes, Newington Green, Hackney, London, N16 9EH
Phone: 020 3745 1261 / 020 7923 9090
Email: info@olaygazete.co.uk / seriilanlar@olaygazete.co.uk
© 2026 JNews - Premium WordPress news & magazine theme by Jegtheme.
