Perhaps I should have posted this as my first message to the group, but I will add it anyway for completeness, or in case someone wants to try Scala out and grab this template as a starting point for pasting in code from other examples.

object Hello {
  def main(args: Array[String]): Unit = {
    println("Hello world.")
  }
}

Save the above into Hello.scala, then compile and run your program like this:
powerbookg4:tmp zemian$ scalac Hello.scala
powerbookg4:tmp zemian$ scala Hello
Hello world.

Note that Scala's main entry point is an "object" instead of a "class". An "object" in Scala is like a class in that it defines a type, but it is forced to be a singleton (only one instance), so it is almost like "static" in Java. Your main entry point on the command line must be an object with a main method defined.
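To make the "almost like static" point concrete, here is a minimal sketch (the Counter and CounterDemo names are made up for illustration, not part of the example above) showing that an object's members are accessed without "new", much like static members in Java; the runtime creates the single instance for you:

```scala
// Counter is a singleton: there is exactly one instance, created for you.
object Counter {
  private var count = 0
  def increment(): Int = { count += 1; count }
}

object CounterDemo {
  def main(args: Array[String]): Unit = {
    // No "new Counter" anywhere; calls go straight through the object name,
    // just as you would call a static method in Java.
    println(Counter.increment())
    println(Counter.increment())
  }
}
```

Unlike Java statics, though, the object is a real instance, so it can extend traits and be passed around as a value.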


You may turn your source file into a script by adding, at the end of the file, an expression that invokes the main method, and then running it through "scala" instead of compiling it. For example:

object Hello {
  def main(args: Array[String]): Unit = {
    println("Hello world.")
  }
}
Hello.main(args)

Note that the variable "args" is predefined when you run the file as a script. To run it, just invoke it like this:
powerbookg4:tmp zemian$ scala Hello.scala
Hello world.

Note the differences: (1) there is no compile step, and (2) you give scala the script file name, not the type name!
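Since "args" carries whatever you pass after the script name on the command line, your main method can make use of it too. A minimal sketch (the Greet name and greeting logic are made up for illustration):

```scala
object Greet {
  def main(args: Array[String]): Unit = {
    // Fall back to the default greeting when no arguments are given,
    // otherwise greet each command-line argument in turn.
    if (args.isEmpty) println("Hello world.")
    else args.foreach(name => println("Hello, " + name + "."))
  }
}
```

As before, append Greet.main(args) at the end of the file, save it as Greet.scala, and run something like "scala Greet.scala Alice Bob" to see one greeting per argument.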


Happy programming!
