Monthly AI Newsletter from PortNLP Lab

Welcome back to explAIn, the PortNLP Lab's monthly newsletter. As we step into the new year, we continue our commitment to advancing responsible artificial intelligence. This month, we highlight significant developments in the AI landscape and share exciting updates from our lab.

In the Know 📰

U.S. Policy Shifts on AI Regulation
In a notable policy reversal, President Donald Trump has rescinded the executive order on artificial intelligence risks previously established by the Biden administration. The original order mandated that AI developers share safety test results with the government prior to public release, aiming to mitigate potential risks to national security and public safety. The revocation has sparked debate over how to balance promoting innovation with ensuring responsible AI development.
🌐 AI Open Source Advancement
AI lab DeepSeek has released its new R1 model family under an open MIT license, with its largest version containing 671 billion parameters. The company claims the model performs at levels comparable to OpenAI's o1 simulated reasoning model on several math and coding benchmarks.
🟢 Sustainability Spotlight: AI's Carbon Footprint vs. Human Effort
AI training is highly energy-intensive, with models like GPT-3 emitting over 500 metric tons of CO₂, comparable to the emissions of more than 100 gasoline-powered cars driven over a year.
— MIT Technology Review
Spotlight on Us

We're kicking off 2025 with some exciting research and community engagement! Here's what we've been up to this January:

🔬 Fresh Off the Press
📄 Cracking the code of Nepali idioms!
📄 What really drives multilingual models?
📄 Language Models quizzed with Linguistic Ambiguity

🤝 Silicon Forest Tech Summit 2025

Prof. Agrawal delivered a lightning talk exploring the role of humans as AI evolves, reminding us that as AI continues to improve, it is the human touch that shapes the narrative. The talk emphasized the importance of responsible research and education.
Chief Editor of the Month: Sina Bagheri Nezhad