software engineering

Tech Optimizer
December 25, 2025
Bernardo Quintero set out to find the programmer behind the Virus Málaga, a piece of malware that had a significant impact on his life and career in cybersecurity. The virus sparked Quintero's interest in the field and led to the creation of VirusTotal, which Google acquired in 2012, an acquisition that helped elevate Málaga as a tech hub in Europe. Quintero revisited the virus code and discovered a clue linking it to a programmer named Enrique, affectionately known as Kiki. From Antonio Astorga, a source who has since passed away, he learned that the virus contained a hidden message against the Basque terrorist group ETA. Astorga's legacy continues through his son, Serhiy, who has aspirations in cybersecurity and quantum computing.
Winsage
December 25, 2025
A Microsoft distinguished engineer, Galen Hunt, clarified that a project aimed at rewriting parts of Microsoft's code using AI and Rust is strictly research-focused and not an official plan to phase out C and C++ from Windows by 2030. His team is developing technology for large-scale code migration between programming languages, aiming for "1 engineer, 1 month, 1 million lines of code." This project is part of Microsoft's Future of Scalable Software Engineering group and is not a roadmap for Windows 11 or future versions. Microsoft has been integrating Rust into its products, including rewriting segments of the Windows kernel in 2023, as part of its commitment to adopting memory-safe programming languages.
Winsage
December 25, 2025
A Microsoft engineer, Galen Hunt, clarified that his earlier statements about phasing out all C and C++ code by 2030 were misinterpreted. He emphasized that the initiative he discussed is a research project focused on developing technology for large-scale code migration between programming languages, not a definitive plan for Windows. The goal of the project is to enable "1 engineer, 1 month, 1 million lines of code" using AI agents and algorithmic infrastructure. Hunt's team is looking for a Principal Software Engineer with Rust experience to assist in this research. Microsoft has been integrating Rust into its products, including rewriting parts of the Windows kernel in Rust, as it aims to improve security and reduce programming errors. However, Hunt noted that Rust is not necessarily the final destination for all Microsoft code.
Winsage
December 24, 2025
Galen Hunt, a Distinguished Engineer at Microsoft, has proposed eliminating all C and C++ code within the company, focusing on transitioning to Rust. This initiative aims to address technical debt and improve memory safety: memory-safety bugs in C and C++ code account for approximately 70% of the vulnerabilities in Microsoft products. Microsoft plans to leverage AI and modern tooling to facilitate the transition, which includes rewriting portions of the Windows kernel in Rust. Hunt is seeking a Principal Software Engineer with Rust expertise to support this effort. The adoption of Rust is growing globally, with a reported 2.3 million developers using it, and major tech companies are increasingly integrating it into their infrastructure.
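The memory-safety argument behind the Rust push can be illustrated briefly: C and C++ permit out-of-bounds reads that silently return garbage from adjacent memory, while memory-safe languages turn the same mistake into a detectable error. A minimal sketch in Python, used here purely for illustration (it is not part of Microsoft's tooling, and Rust enforces the equivalent checks at compile time or via runtime panics):

```python
# In C, reading past the end of an array is undefined behavior: the
# program may silently return whatever bytes sit in adjacent memory.
# Memory-safe languages detect the out-of-bounds access instead.

def read_item(buffer: list, index: int):
    """Bounds-checked read: out-of-range access fails loudly."""
    try:
        return buffer[index]
    except IndexError:
        return None  # error surfaced, no silent memory corruption

buffer = [10, 20, 30]
print(read_item(buffer, 1))   # valid access
print(read_item(buffer, 99))  # out of bounds: caught, not undefined
```

The point of the 70% statistic is precisely this class of bug: bounds errors, use-after-free, and similar memory mistakes that safe languages rule out by construction.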
Winsage
December 24, 2025
Microsoft is planning to replace C and C++ with Rust across its codebases by 2030, as stated by engineer Galen Hunt. The company aims to eliminate every line of C and C++ using artificial intelligence and advanced algorithms, targeting "1 engineer, 1 month, 1 million lines of code." Microsoft has developed a code processing infrastructure to support this initiative, which is already operational for various code understanding challenges. In 2023, Microsoft began rewriting parts of the Windows kernel in Rust due to vulnerabilities associated with C and C++. The new role advertised by Hunt is part of the Future of Scalable Software Engineering group within Microsoft CoreAI, indicating a significant investment in modernizing Microsoft's code for enhanced security and efficiency.
Winsage
December 23, 2025
Microsoft plans to eliminate all C and C++ code from its products by 2030, as announced by Galen Hunt on November 25, 2025. This decision follows significant malfunctions in Windows 11 that began in July 2025, affecting core components like the Start Menu and Taskbar. The company aims to achieve "1 engineer, 1 month, 1 million lines of code" through AI-assisted rewrites. A patch to address these issues is promised for December 2025. The initiative is part of Microsoft's Future of Scalable Software Engineering group, with a focus on leveraging AI to manage and modify code at scale. A Principal Software Engineer position has been opened, emphasizing expertise in Rust. Microsoft is investing heavily in AI infrastructure, with plans to allocate billions for datacenter construction in 2025.
AppWizard
November 24, 2025
Closing arguments concluded on November 21 in the remedies trial for the U.S. Department of Justice's case against Google's advertising technology business. A federal judge is now deliberating on remedies following an April 2025 ruling that Google holds two illegal monopolies in the ad tech sector. The DOJ is advocating for the divestiture of Google's ad marketplace platform, AdX, which Google argues would present significant technological challenges. Judge Leonie Brinkema has acknowledged the urgency of the situation while recognizing the complexities introduced by Google's anticipated appeals. Google has presented expert testimony highlighting the difficulty of breaking up its ad tech business, while the DOJ argues that a breakup is necessary for a more competitive market. The court's decision could have broader implications for the advertising technology landscape and digital competition.
Tech Optimizer
November 5, 2025
Cloud data platform vendor Snowflake has open-sourced its PostgreSQL extensions to enhance integration with its lakehouse system. The new pg_lake extension allows developers to read and write Apache Iceberg tables directly from PostgreSQL, with PostgreSQL serving as the Iceberg catalog, eliminating the need to extract and move data for analytics. Developers can also query raw data files in the data lake, external Iceberg tables, Delta tables, and various geospatial file formats directly from PostgreSQL. The extensions were developed by Crunchy Data, the PostgreSQL specialist startup Snowflake acquired in June, and are licensed under the Apache license. Analysts suggest the extension lowers the barrier for PostgreSQL teams to gradually adopt Snowflake's lakehouse and AI capabilities rather than treating it as an all-or-nothing platform decision. Snowflake also announced the general availability of Snowflake Intelligence, an AI agent for natural language queries, alongside enhancements to its Horizon data catalog.
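Conceptually, the design described above makes PostgreSQL the catalog for an open lakehouse: the relational database tracks which data files belong to which Iceberg table, while the data itself stays in object storage. A toy sketch of that catalog pattern in Python, using the stdlib sqlite3 module to stand in for the relational store (illustrative only; the class and method names are assumptions, not pg_lake's actual implementation):

```python
import sqlite3

class LakehouseCatalog:
    """Toy model of database-as-catalog: the relational store tracks
    table metadata, while data files live elsewhere (e.g. object storage)."""

    def __init__(self):
        self.db = sqlite3.connect(":memory:")
        self.db.execute(
            "CREATE TABLE catalog (table_name TEXT, data_file TEXT)"
        )

    def add_file(self, table_name: str, data_file: str) -> None:
        # Registering a file is the catalog-level analogue of a write commit.
        self.db.execute(
            "INSERT INTO catalog VALUES (?, ?)", (table_name, data_file)
        )

    def files_for(self, table_name: str) -> list:
        # A reader asks the catalog which files make up the table's
        # current state, then scans those files directly.
        rows = self.db.execute(
            "SELECT data_file FROM catalog WHERE table_name = ?",
            (table_name,),
        )
        return [r[0] for r in rows]

catalog = LakehouseCatalog()
catalog.add_file("events", "s3://lake/events/part-000.parquet")
catalog.add_file("events", "s3://lake/events/part-001.parquet")
print(catalog.files_for("events"))
```

The appeal for developers is that queries and metadata management happen in the database they already run, while analytics engines read the same files from storage without any data movement.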
Tech Optimizer
October 29, 2025
A disconnect has been identified between traditional databases and the requirements of AI agents, prompting a rethinking of database architecture. Four initiatives are redefining databases for AI:

1. **AgentDB** treats databases as lightweight, disposable artifacts, allowing agents to create and discard databases easily for single tasks. It caters to simple AI applications and temporary data processing needs, and is not suitable for complex transactional systems.
2. **Postgres for Agents** enhances PostgreSQL with features like zero-copy forking, enabling secure testing and experimentation without affecting live systems. It targets developers building AI applications and offers a cloud service with a free tier.
3. **Databricks Lakebase** integrates transactional capabilities within a data lakehouse architecture, giving AI agents seamless access to real-time operational data and historical insights. It aims to unify data workloads and reduce the complexity of maintaining separate databases.
4. **Bauplan Labs** focuses on safety and reliability, developing a "programmable lakehouse" with a "Git-for-data" model that ensures verifiable and auditable data operations for AI agents. It targets high-stakes scenarios where mistakes could have significant repercussions.

These initiatives reflect a broader trend of reshaping databases to cater to machines, emphasizing ephemeral, isolated, and context-aware systems.
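The "disposable database" idea behind the first initiative can be sketched in a few lines: an agent spins up a fresh database for one task, works in it, and discards it, so no state leaks between tasks. A minimal sketch using Python's built-in sqlite3 (illustrative only; this is not AgentDB's actual API):

```python
import sqlite3
from contextlib import contextmanager

@contextmanager
def disposable_db():
    """Create a throwaway in-memory database for a single agent task.
    The whole database vanishes when the task finishes, so nothing
    carries over between tasks."""
    db = sqlite3.connect(":memory:")
    try:
        yield db
    finally:
        db.close()  # discard the entire database with the connection

def run_task(rows):
    # Each task gets its own isolated scratch database.
    with disposable_db() as db:
        db.execute("CREATE TABLE scratch (value INTEGER)")
        db.executemany("INSERT INTO scratch VALUES (?)", [(r,) for r in rows])
        (total,) = db.execute("SELECT SUM(value) FROM scratch").fetchone()
        return total

print(run_task([1, 2, 3]))  # → 6
```

The isolation is the point: because every task starts from an empty database and ends by discarding it, an agent's experiments can never corrupt shared state, which is the same property the other initiatives pursue with forking or Git-for-data models.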