Part 1 here: https://codetown.com/group/kotlin/forum/topics/kotlin-thursdays-kot...
Welcome, all, to another week of Kotlin Thursdays. This week we dive deeper into the Kotlin Koans, and like all koans, they get progressively more difficult. We are going to cover default arguments, lambdas, strings, and data classes. These koans are a great way to get into functional programming and learn the Kotlin syntax.
In the default arguments koan, you will see how Kotlin lets you declare default values for parameters right in the function signature. Putting the defaults up front makes the code easier to read and support, and it cuts down on overloads, so there are fewer lines to sift through. I learned this style of declaration early on and have always preferred it.
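Here is a minimal sketch of the idea, using my own example function rather than the actual koan code. One function with defaults replaces what would be several overloads in Java:

// Hypothetical example: callers can omit any parameter that has a default,
// so a single function covers several call shapes.
fun greet(name: String, greeting: String = "Hello", shout: Boolean = false): String {
    val message = "$greeting, $name!"
    return if (shout) message.uppercase() else message
}

fun main() {
    println(greet("Ada"))                        // Hello, Ada!
    println(greet("Ada", greeting = "Welcome"))  // Welcome, Ada!
    println(greet("Ada", shout = true))          // HELLO, ADA!
}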
Lambdas are still a bit confusing to me. My first introduction to lambdas was playing with them on Amazon Web Services, and then I saw them pop up in Java 8, so I'm glad to see them again here. The "it" convention threw me at first, but once I read through the function from right to left, the use of "it" made perfect sense.
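To illustrate what I mean, here is a small example of my own (not the koan itself) showing an explicit lambda parameter next to the implicit "it", which Kotlin provides when a lambda takes exactly one argument:

fun main() {
    val numbers = listOf(1, 2, 3, 4, 5)

    // Explicit parameter name...
    val evens = numbers.filter { n -> n % 2 == 0 }

    // ...or the implicit `it` for single-parameter lambdas.
    val doubled = numbers.map { it * 2 }

    println(evens)    // [2, 4]
    println(doubled)  // [2, 4, 6, 8, 10]
}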
Strings, glorious strings: yes, I sing this out loud often. This koan teaches us about string literals and string templates and how to use them. It's weird, but for some reason this koan makes me happy. I think this is where things started making sense to me when I began my Kotlin journey.
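As a quick illustration (again my own sketch, not the koan exercise), string templates and raw string literals look like this:

fun main() {
    val name = "Kotlin"

    // String templates: $variable or ${expression} inside an ordinary string.
    println("Hello, $name!")                      // Hello, Kotlin!
    println("The name has ${name.length} chars")  // The name has 6 chars

    // Raw (triple-quoted) string literals keep newlines and need no escaping.
    val note = """
        Raw strings are handy for regex patterns like \d+
        because backslashes are taken literally.
    """.trimIndent()
    println(note)
}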
The last koan we explore is the data class. This is where the readability of Kotlin really shines: we are given a class in Java and rewrite it in Kotlin, and as you might have guessed, the Kotlin version is cleaner.
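Here is a rough sketch of the idea with a made-up Person class (not the exact koan code): the single data class line gives you equals(), hashCode(), toString(), and copy() for free, replacing the fields, getters, and overrides the equivalent Java class would need.

// Hypothetical example of a Kotlin data class.
data class Person(val name: String, val age: Int)

fun main() {
    val alice = Person("Alice", 29)
    val older = alice.copy(age = 30)

    println(alice)                          // Person(name=Alice, age=29)
    println(older)                          // Person(name=Alice, age=30)
    println(alice == Person("Alice", 29))   // true: structural equality
}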
I hope you enjoy the Kotlin Thursdays episode!
For this walkthrough, you will need to install the EduTools plugin into IntelliJ!
https://www.jetbrains.com/help/education/install-edutools-plugin.html?section=IntelliJ%20IDEA
Here is another overview of what we are doing -
https://www.jetbrains.com/help/education/learner-start-guide.html?s...
Think of these resources as supplemental if you happen to be more curious. We always encourage looking into documentation for things you use!
Super! Can’t wait to work through it!
