Codetown ::: a software developer's community
All the tutorials and books for Node.js seem to use Mongo as the database. I am not sold on 'document' databases and would like to know how difficult it is to use some flavor of plain old tried-and-true SQL with Node.js.
Does anybody have any experience in this area?
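For what it's worth, querying a relational database from Node.js is straightforward with any of the SQL drivers on npm. Here is a minimal sketch assuming the mysql package; the connection details and the users table are placeholders for illustration, not anything from this thread:

```js
// Plain SQL from Node.js using the 'mysql' package (npm install mysql).
// Connection settings and the 'users' table are placeholders.
const mysql = require('mysql');

const connection = mysql.createConnection({
  host: 'localhost',
  user: 'app_user',
  password: 'secret',
  database: 'codetown_dev'
});

connection.connect();

// Parameterized query: the driver escapes the values, which addresses the
// usual SQL-injection worry about building queries in a scripting language.
connection.query(
  'SELECT id, name FROM users WHERE created_at > ?',
  [new Date('2011-01-01')],
  (err, rows) => {
    if (err) throw err;
    console.log(rows);
    connection.end();
  }
);
```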
One of the traditional knocks on JS is the volatility of running SQL queries from an interpreted script, not to mention security: in other words, how do you regulate resources and results in a varying client environment? Node.js is supposed to provide a server-side capability, but I would be skeptical of its implementation of a transactional capability (a memento pattern, or the ability to roll back transactions) at least until it has been thoroughly tested. The fact that most discussions couple it with NoSQL databases is a clue as to what its intended usage should be: perhaps caches for local search tools like Solr, which are easy to update and rebuild, but less likely to be an efficient engine for individualized RDBMS queries.
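On the transaction concern specifically, the mainstream SQL drivers for Node do expose BEGIN/COMMIT/ROLLBACK; you just issue them yourself. A rough sketch using the node-postgres (pg) package; the accounts table and the transfer scenario are made up for illustration:

```js
// Explicit transaction with rollback using node-postgres (npm install pg).
// Table and column names below are illustrative only.
const { Pool } = require('pg');
const pool = new Pool(); // reads the standard PG* environment variables by default

async function transfer(fromId, toId, amount) {
  const client = await pool.connect();
  try {
    await client.query('BEGIN');
    await client.query(
      'UPDATE accounts SET balance = balance - $1 WHERE id = $2',
      [amount, fromId]
    );
    await client.query(
      'UPDATE accounts SET balance = balance + $1 WHERE id = $2',
      [amount, toId]
    );
    await client.query('COMMIT');
  } catch (err) {
    // Any failure undoes both updates -- the rollback behavior raised above.
    await client.query('ROLLBACK');
    throw err;
  } finally {
    client.release();
  }
}
```

Nothing here is special to Node: the rollback guarantee comes from the database itself, and the driver just relays the statements.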
