Learning Groovy and Self-publishing

What is Groovy and why should I care?

Hello again, it's me, Adam. Earlier this year, I finished my self-published book, Learning Groovy, which is about, well, learning Groovy. It also covers the top Groovy-based tools and frameworks: Gradle, Grails, Spock, and Ratpack.

I've enjoyed using Leanpub as a place to work on my books (What's new in Java 8 and others). It's really easy and developer-friendly: it syncs with a Dropbox folder, and you can write your book in Markdown (which I did). I've enjoyed a fairly constant trickle of purchases, but I was frustrated that I never had enough time to devote to the other huge part of self-publishing: marketing. For a book to be really successful, it needs to be marketed well, and that takes a lot of time and money. So, when it came to publishing "Learning Groovy," I approached several publishers to do the marketing for me.

Luckily, one of them accepted, and I'm currently in the process of final edits (the publisher shall remain anonymous for now).

This means that you can only get the self-published version of "Learning Groovy" for a limited time. Once it goes to the publisher, I have to take down all my versions per the contract.

"What is Groovy and why should I care?" you ask? First of all, what rock have you been living under? Secondly, Groovy is a mature and flexible open-source language that runs on the JVM. Want to learn more about functional programming, want optional dynamic typing, easy restful services, easy reactive web applications (Ratpack)? Maybe you to learn about the most popular build framework and testing frameworks for Java (Gradle and Spock)? Groovy is where it's at.
