Introduction

Last week, we went over higher-order functions in Kotlin. We learned how higher-order functions can accept functions as parameters and can also return functions. This week, we will take a look at lambdas, another kind of function that is very popular in the functional programming world.



Logic & Data

Computer programs are made up of two parts: logic and data. Usually, the logic is described in functions, and data is passed to those functions. The functions do something with the data and return a result. When we write logic, we typically put it in a named function. As we saw last week, this is a typical named function:

fun hello(name: String): String {
    return "Hello, $name"
}

Then you can call this function:

fun main() {
    println(hello("Matt"))
}

Which gives us the result:

Hello, Matt

Functions as Data

There is a concept in the functional programming world where functions are treated as data. Lambdas (functions as data) can do the same things as named functions, but because a lambda is a value, its logic can be passed directly into other functions. A lambda can also be assigned to a variable, just like any other value.
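For instance, a lambda can be handed straight to one of the higher-order functions we talked about last week. The greet function below is just an illustrative sketch (it isn't part of this series), but it shows the idea:

fun greet(name: String, formatter: (String) -> String): String {
    // greet is a hypothetical higher-order function used only for illustration
    return formatter(name)
}

fun main() {
    // The lambda is passed in directly as data; no named function is needed
    println(greet("Matt", { name: String -> "Hello, $name" }))
}

Running this prints Hello, Matt, just like the named function did.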

Lambda Syntax

Lambdas are similar to named functions, but they have no name and their syntax looks a little different. Whereas a named function in Kotlin looks like this:

fun hello(): String {
    return "Hello World"
}

The lambda expression would look like this:

{ "Hello World" }

Here is an example with a parameter:

fun hello(name: String): String {
    return "Hello, $name"
}

The lambda version:

{ name: String -> "Hello, $name" }

You can call the lambda by passing an argument to it in parentheses after the closing curly brace:

{ name: String -> "Hello, $name" }("Matt")
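For example, wrapping that call in a main function (just a quick sketch to show the output) gives us the familiar greeting:

fun main() {
    // The lambda literal is invoked in place; this prints: Hello, Matt
    println({ name: String -> "Hello, $name" }("Matt"))
}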

It’s also possible to assign a lambda to a variable:

val hello = { name: String -> "Hello, $name" }

You can then call the variable that the lambda has been assigned to, just as if it were a named function:

hello("Matt")

Lambdas provide us with a convenient way to pass logic into other functions without having to define that logic in a named function. This is very useful when processing lists or arrays of data. We’ll take a look at processing lists with lambdas in the next post!
