March saw a flurry of announcements and developments in large language models, but there was also significant news in other technologies. One notable change was the United States’ new cybersecurity strategy, which shifts responsibility for security from customers to software and service providers. It encourages the use of memory-safe languages such as Rust and Java, along with safer C alternatives like Zig, to build more secure software.
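To illustrate the class of bugs this push is meant to eliminate, here is a minimal Rust sketch (not from the strategy itself, just an assumed example): indexing is bounds-checked, and ownership rules catch use-after-free at compile time rather than at runtime.

```rust
// A small illustration of memory safety in Rust:
// out-of-bounds reads are caught instead of silently reading memory,
// and ownership rules prevent use-after-free at compile time.
fn main() {
    let buffer = vec![1u8, 2, 3, 4];

    // Checked access: an out-of-range index yields None rather than
    // reading past the end of the allocation, as C would allow.
    match buffer.get(10) {
        Some(byte) => println!("byte: {byte}"),
        None => println!("index out of bounds, handled safely"),
    }

    // Ownership: once `buffer` is moved into `consume`, any later use of
    // `buffer` is a compile-time error, ruling out use-after-free bugs.
    consume(buffer);
    // println!("{:?}", buffer); // would not compile: value moved above
}

fn consume(data: Vec<u8>) {
    println!("consumed {} bytes", data.len());
}
```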

In AI, many new models, tools, and services were released, including GPT4All, Fair Diffusion, Dolly, and ChatGPT’s plugin API. Stanford’s Alpaca 7B, an inexpensive instruction-tuned derivative of LLaMA 7B, showed that capable models can run on relatively modest hardware. Several large companies, including Microsoft and Google, announced plans to incorporate AI capabilities into their products. Meanwhile, debates continued over watermarking and the accuracy of AI-generated content.

In the programming world, Zig, a language designed as a simpler, safer alternative to C that also competes with C++ and Rust, has been gaining attention. GitHub announced Copilot X, its vision for the next generation of the Copilot service, with new features including a voice interface and the ability to explain code. In security, the Evasive.AI platform generated malware samples and corresponding training data for security systems, and the US released the national cybersecurity strategy noted above, which emphasizes long-term investment and shifts responsibility to software and service providers.
