Codetown ::: a software developer's community
I don't clearly understand the difference between these two concepts. Someone told me that the essential difference is that cloud computing gives you a large amount of storage, while the grid offers more than storage: with the grid you can take advantage of a lot of computing power.
Does anyone understand these two concepts more clearly and can explain them to us?
I don't claim to be the expert, but the difference is (I think) in use.
Grid represents a scalable framework. You write your algorithm and your code and use as much computing power as your wallet can afford. (Useful, as some work can be highly parallelizable.)
Cloud computing offers storage (true), but it also represents the applications as well. Ideally, with cloud computing you don't need to have certain applications on your desktop - as long as you can hit the cloud, you can get, update, and use your data.
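To illustrate that point about highly parallelizable work, here is a minimal sketch using plain Python multiprocessing on a single machine - not an actual grid framework, where the same pattern would farm the tasks out to many machines:

```python
# Sketch: embarrassingly parallel work split across local worker processes.
# On a real grid, each task would be dispatched to a different machine.
from multiprocessing import Pool

def expensive_task(n):
    # Stand-in for a CPU-heavy, fully independent unit of work.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    inputs = [10_000, 20_000, 30_000, 40_000]
    with Pool(processes=4) as pool:
        # Each input is processed independently, so throughput scales
        # with the number of workers your wallet can afford.
        results = pool.map(expensive_task, inputs)
    print(results)
```

The key property is that the tasks don't depend on each other, so adding more workers (or, on a real grid, more machines) cuts the wall-clock time roughly proportionally.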
Thanks, Thomas.
What I got:
Grid - lots of computing power, and work can be highly parallelizable
Cloud - storage, and you don't need to have certain applications on your desktop (is that just like a server application?)
Can someone tell us more?
I think if you look at the history, you will understand some of the difference.
In my own experience, the grid began with Oracle using it as a type of metadatabase, which would point to multiple databases residing on different but uniform hardware systems. So if a company had multiple Unix boxes and needed to increase the size of their database, instead of purchasing additional hardware they could implement the grid database and combine their multiple Unix servers into one database resource.
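To make that "many servers, one database resource" idea concrete, here is a rough sketch of the pattern. This is a hand-rolled illustration using in-memory SQLite databases as stand-ins for the separate Unix boxes, not Oracle's actual grid product:

```python
# Sketch: one logical query fanned out over several physical databases.
# Three in-memory SQLite databases stand in for three separate servers.
import sqlite3

def make_node(rows):
    """Create a 'server' holding one slice of the orders table."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)
    return conn

nodes = [
    make_node([(1, 10.0), (2, 20.0)]),
    make_node([(3, 30.0)]),
    make_node([(4, 40.0), (5, 50.0)]),
]

def grid_total():
    # The "metadatabase" layer: run the same query on every node
    # and combine the partial results into a single answer.
    return sum(conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
               for conn in nodes)

print(grid_total())  # 150.0 across all three "servers"
```

The caller sees one total, even though the rows live on three different "machines" - which is the combining trick described above, minus all the real-world concerns (transactions, failures, rebalancing) a production grid has to handle.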
Cloud goes much further, in that it offers not only a database but an entire server, including the operating system.
The cloud exposes an operating system, whereas a grid exposes a database.
But I am no buzzword expert, so I might be wrong.
I just talked to a buddy about this. Essentially, the Oracle Grid product is different because it runs the DB in memory, so access times are a lot quicker. I don't think it is really a matter of grid vs. cloud so much as grid computing being a way to handle DB transactions faster.
He said their grid servers had something like 72 GB of RAM. Freaking crazy.
Please, Bradley, what do you think about Jackie's reaction?
Created by Michael Levin Dec 18, 2008 at 6:56pm. Last updated by Michael Levin May 4, 2018.
