Wikipedia Place

We had a unique opportunity to speak with Brion Vibber at the October Orlando JUG meeting. Let's discuss what we learned for those who couldn't attend, and expand on what we heard. We'll cover both technical and socio-cultural topics.

Members: 9
Latest Activity: Oct 27, 2011

About Wikipedia Place


Wikipedia is arguably the most popular and highest-volume site on the web. Brion Vibber has been its tech guy since the start. He described the initial architecture on Swampcast as just a couple of LAMP servers in Tampa. Since then the architecture has evolved, and many lessons have been learned.

There are also workflows, ancillary sites, the community itself, and many other facets of Wikipedia we can learn from and even influence in the future.

Let's talk about it here, where we can all ask questions and the discussions will have persistence.
(book photo from FromOldBooks)

Discussion Forum

Wikipedia Architecture - High Level View

The Wikipedia website began as a couple of LAMP servers. Brion described the wiki he chose and other details in the…Continue

Tags: vibber, performance, LAMP, swampcast, architecture

Started by Michael Levin Nov 2, 2009.

Wikipedia Place Reading List


Comment Wall


 

Members (9)

 
 
 

Happy 10th year, JCertif!

Notes

Welcome to Codetown!

Codetown is a social network. It's got blogs, forums, groups, personal pages and more! You might think of Codetown as a funky camper van with lots of compartments for your stuff and a great multimedia system, too! Best of all, Codetown has room for all of your friends.

When you create a profile for yourself you get a personal page automatically. That's where you can be creative and do your own thing. People who want to get to know you will click on your name or picture and…
Continue

Created by Michael Levin Dec 18, 2008 at 6:56pm. Last updated by Michael Levin May 4, 2018.

Looking for Jobs or Staff?

Check out the Codetown Jobs group.

 

Enjoy the site? Support Codetown with your donation.



InfoQ Reading List

Tailwind CSS 4.2 Ships Webpack Plugin, New Palettes and Logical Property Utilities

Tailwind CSS version 4.2.0, released on February 18, 2026, includes a webpack plugin for streamlined integration and four new color palettes. It expands logical property utilities and improves recompilation speed by 3.8x. This update is particularly beneficial for teams on existing projects and those developing multilingual applications.

By Daniel Curtis

Cloudflare and ETH Zurich Outline Approaches for AI-Driven Cache Optimization

Cloudflare and ETH Zurich highlight how AI-driven crawler traffic challenges traditional caching in CDNs and databases. They propose AI-aware strategies including separate cache tiers, adaptive algorithms, and pay-per-crawl models to balance performance for human users and AI services while maintaining cache efficiency and system stability.

By Leela Kumili

GitHub Actions Custom Runner Images Reach General Availability

GitHub has announced the general availability of custom images for its hosted runners, ending the public preview that began in October. The feature enables teams to start from a GitHub-approved base image and construct a virtual machine image that meets their workflow requirements.

By Claudio Masolo

Presentation: Local First – How To Build Software Which Still Works After the Acquihire

Alex Good discusses the fragility of modern cloud-dependent apps and shares a roadmap for "local-first" software. By leveraging a Git-like DAG structure and Automerge, he explains how to move from brittle client-server models to resilient systems where data lives on-device. He explores technical implementation, rich-text merging, and how this infrastructure simplifies engineering workflows.

By Alex Good

Article: Stateful Continuation for AI Agents: Why Transport Layers Now Matter

Agent workflows make transport a first-order concern. Multi-turn, tool-heavy loops amplify overhead that is negligible in single-turn LLM use. Stateful continuation cuts overhead dramatically. Caching context server-side can reduce client-sent data by 80%+ and improve execution time by 15–29%.

By Anirudh Mendiratta

© 2026   Created by Michael Levin.
