Codetown ::: a software developer's community
| Discussions Replied To (1) | Replies | Latest Activity |
|---|---|---|
| "With Eclipse/Aptana I *think* I got installed, but the online references that showed…" Kevin Neelands replied Dec 5, 2012 to Best Ruby I.D.E? | 2 | Dec 5, 2012 Reply by Kevin Neelands |
Codetown is a social network. It's got blogs, forums, groups, personal pages and more! You might think of Codetown as a funky camper van with lots of compartments for your stuff and a great multimedia system, too! Best of all, Codetown has room for all of your friends.
Created by Michael Levin Dec 18, 2008 at 6:56pm. Last updated by Michael Levin May 4, 2018.
Check out the Codetown Jobs group.

LMArena has launched Code Arena, a new evaluation platform that measures AI models' performance in building complete applications instead of just generating code snippets. It emphasizes agentic behavior, allowing models to plan, scaffold, iterate, and refine code within controlled environments that replicate actual development workflows.
By Robert Krzaczyński
Amazon Web Services has announced enhancements to its CodeBuild service, allowing teams to use Amazon ECR as a remote Docker layer cache, significantly reducing image build times in CI/CD pipelines. By leveraging ECR repositories to persist and reuse build layers across runs, organisations can skip rebuilding unchanged parts of containers and accelerate delivery.
By Craig Risi
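The mechanism rests on BuildKit's registry cache backend: a build exports its layer cache to an ECR repository and imports it again on the next run, so only changed layers are rebuilt. The Python sketch below wraps the equivalent docker buildx invocation; the repository URI, cache tag, and image tag are placeholders, and the exact flags CodeBuild configures on your behalf may differ from this hand-rolled version.

```python
import subprocess

# Placeholder values -- substitute your own account, region, and repository.
ECR_REPO = "123456789012.dkr.ecr.us-east-1.amazonaws.com/my-app"
CACHE_REF = f"{ECR_REPO}:build-cache"   # where the layer cache is stored and read
IMAGE_REF = f"{ECR_REPO}:latest"

def build_with_remote_cache() -> None:
    """Build an image, reusing layers cached in ECR from previous runs."""
    subprocess.run(
        [
            "docker", "buildx", "build",
            # Import any layers already cached in the ECR repository.
            "--cache-from", f"type=registry,ref={CACHE_REF}",
            # Export the full layer cache back to ECR for the next build;
            # image-manifest=true keeps the cache artifact in a form ECR accepts.
            "--cache-to", f"type=registry,ref={CACHE_REF},mode=max,image-manifest=true",
            "--tag", IMAGE_REF,
            "--push",
            ".",
        ],
        check=True,
    )

if __name__ == "__main__":
    build_with_remote_cache()
```

On a second run with an unchanged base image and dependencies, the cached layers are pulled from ECR instead of being rebuilt, which is where the reported time savings come from.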
Late projects. Architectures that drift from their original design. Code that mysteriously evolves into something nobody planned. These persistent problems in software development often stem not from technical failures, but from forces we pretend don't exist—reward systems that incentivize the wrong behaviors, organizational structures that ignore domain boundaries, and human dynamics.
By Vanessa Formicola
Marina Moore, a security researcher and co-chair of CNCF's Security and Compliance TAG, shares her concerns about the security vulnerabilities of containers. She explains where the issues originate, offers solutions, and discusses micro-VMs as an alternative to containers. She also highlights the risks associated with AI inference.
By Marina Moore
Kimi released K2, a Mixture-of-Experts large language model with 32 billion activated parameters and 1.04 trillion total parameters, trained on 15.5 trillion tokens. The release introduces MuonClip, a new optimizer that builds on the Muon optimizer by adding a QK-clip technique designed to address training instability, which the team reports resulted in "zero loss spike" during pre-training.
By Vinod Goje
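The QK-clip idea can be stated in a few lines: after each optimizer step, if the largest pre-softmax attention logit observed for a head exceeds a threshold, the query and key projection weights are scaled down so the logits stay bounded. The sketch below is a hypothetical PyTorch rendering of that idea, not the released MuonClip code; the function name, the default threshold, and the even split of the rescaling between W_q and W_k are assumptions.

```python
import torch

@torch.no_grad()
def qk_clip_(w_q: torch.Tensor, w_k: torch.Tensor,
             max_logit: float, tau: float = 100.0) -> None:
    """Rescale one head's query/key projection weights in place when the
    largest attention logit seen for that head exceeds the threshold tau.

    max_logit: the maximum pre-softmax q.k logit recorded for this head
    during the last training step (tracking it is left to the caller).
    """
    if max_logit > tau:
        gamma = tau / max_logit       # shrink factor we want on the logits
        scale = gamma ** 0.5          # assumed: split evenly between W_q and W_k
        w_q.mul_(scale)               # logits are bilinear in W_q and W_k,
        w_k.mul_(scale)               # so sqrt(gamma) on each gives gamma overall
```

In a training loop this would run right after the Muon update, once per attention head, using the per-head maximum logit recorded during the forward pass.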
© 2025 Created by Michael Levin.