There's a new book out called Programming F#. What's F#? Why should you care? What's it going to do for you?

Well, here's what you get when you become multilingual: more work! Are you a freelancer? Are you a little depressed with the state of the market these days? Does it blow? I can think of several similar adjectives to describe the state of affairs with opt-in work. What do I mean by opt-in? That's the work in-house staff can do without sacrificing year-end bonuses, holiday parties and perks to outside contractors. Given the choice, what would you do?

On the other hand, how's your Python? Check out the jobs on the python.org jobs list. Rubyist? Do these make you feel better? Java developers have some options these days, too. And that's just a few languages, not to mention the .NET suite and a host of others.

Does that make you happier? How about if you're not a freelancer? You're in-house staff. You say, what's learning a new language going to do for me? Well, different languages have unique features. Pythonistas say "Life's better without braces." Ever hear that? Wonder what they're talking about?
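For the curious, here's a minimal sketch of what the Pythonistas mean: in Python, indentation marks a block, so there are no curly braces to balance.

```python
# A tiny Python function: indentation delimits the block, no braces needed.
def fizz(n):
    if n % 3 == 0:
        return "fizz"
    return str(n)

print([fizz(i) for i in range(1, 7)])  # ['1', '2', 'fizz', '4', '5', 'fizz']
```

The same function in a brace-based language needs an extra pair of delimiters for every block; whether that's a feature or a nuisance is exactly the kind of thing you only form an opinion on by trying another language.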

All this stuff about functional languages is interesting. Now, Microsoft has come out with F#. What's the big deal about functional languages? One way to find out is to see some code. Learning new tricks truly broadens your horizons. And if you think your role is dull, try spicing it up with a new language that interoperates with what you're running now.

Comments?







InfoQ Reading List

Challenges and Solutions for Building Machine Learning Systems

According to Camilla Montonen, the challenges of building machine learning systems are mostly about creating and maintaining the model. MLOps platforms and solutions contain the components needed to build machine learning systems. MLOps is not about the tools; it is a culture and a set of practices. Montonen suggests that we should bridge the divide between the practices of data science and machine learning engineering.

By Ben Linders

Article: Unraveling the Enigma: Debunking Myths Surrounding Lambda Cold Starts

This insightful InfoQ article dispels the common myths surrounding Lambda Cold Starts, a widely discussed topic in the serverless computing community. As serverless architectures continue to gain popularity, misconceptions about Lambda Cold Starts have proliferated, often leading to confusion and misguided optimization strategies.

By Mohit Palriwal

Presentation: Modern Compute Stack for Scaling Large AI/ML/LLM Workloads

Jules Damji discusses which infrastructure should be used for distributed fine-tuning and training, how to scale ML workloads, how to accommodate large models, and how CPUs and GPUs can be utilized.

By Jules Damji

GitHub Enables Dependabot via GitHub Actions, Improves Supply Chain Security

GitHub has released two features to improve the security and resilience of repositories. The first feature allows Dependabot to run as a GitHub Actions workflow using hosted and self-hosted runners. The second release introduces the public beta of Artifact Attestations, simplifying how repository maintainers can generate provenance for their build artifacts.

By Matt Campbell

Meta Releases Llama 3 Open-Source LLM

Meta AI released Llama 3, the latest generation of their open-source large language model (LLM) family. The model is available in 8B and 70B parameter sizes, each with a base and instruction-tuned variant. Llama 3 outperforms other LLMs of the same parameter size on standard LLM benchmarks.

By Anthony Alford

© 2024   Created by Michael Levin.