April 12, 2011

There has been a lot of buzz lately around “riak_core” in various venues, so much so that we are having trouble producing enough resources and content to keep up with the community’s demand (though we most certainly have plans to). While we hustle to catch up, here is a rundown of what is currently available for those of you who want to learn about, look at, and play with riak_core.

(TL;DR – riak_core is the distributed systems framework that underpins Riak and is, in our opinion, what makes Riak the best and most robust distributed datastore available today. If you want to see it in action, go download Riak and put it through its paces.)


If you know nothing about riak_core (or are in the mood for a refresher), start with the Introducing Riak Core blog post that appeared on the Basho Blog a while back. It will introduce you, at a very high level, to what riak_core is and how it works.

Code

There is some overlap among these resources, but they all focus primarily on riak_core:


  • riak_core repo on GitHub
  • Basho Banjo – Sample application that uses Riak Core to play distributed music
  • Try Try Try – Ryan Zezeski’s working blog, which is taking an in-depth look at various aspects of riak_core
  • rebar_riak_core – Rebar templates for riak_core apps from the awesome team at Webster/Clay

Getting Involved With Riak and Riak Core

We are very much at the beginning of what Riak Core can be as a standalone platform for distributed applications, so if you want to get in on the ground floor of something that we feel is truly innovative and unparalleled, now is the time. The best way to join the conversation and to help with the development of Riak Core is to join the Riak mailing list, where you can start asking questions and sharing code.

If you want to see riak_core in action, look no further than Riak, Riak Search, and Luwak. The distribution and scaling components for all of these projects are handled by riak_core.
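To give a flavor of the idea riak_core is built around, here is a rough sketch of consistent hashing with a fixed partition count, the scheme riak_core uses to spread keys across a cluster. This is an illustrative toy in Python, not riak_core’s actual API (riak_core is written in Erlang and uses a 2^160 hash space); the names `Ring` and `key_to_node`, the partition count, and the round-robin assignment are all our own simplifications.

```python
import hashlib

# Toy consistent-hashing ring, loosely modeled on riak_core's approach:
# a fixed hash space is split into a fixed number of partitions, and
# partitions are assigned to nodes round-robin. (Simplified sketch; not
# riak_core's real API or hash-space size.)

RING_SIZE = 2 ** 32      # toy hash space (riak_core actually uses 2^160)
NUM_PARTITIONS = 8       # partition count is fixed up front

class Ring:
    def __init__(self, nodes):
        # Assign each partition an owning node in round-robin order.
        self.nodes = list(nodes)
        self.partition_owners = [
            self.nodes[i % len(self.nodes)] for i in range(NUM_PARTITIONS)
        ]

    def key_to_node(self, key):
        # Hash the key onto the ring, then find the partition that owns
        # that point and return the node responsible for it.
        h = int(hashlib.sha1(key.encode()).hexdigest(), 16) % RING_SIZE
        partition = h // (RING_SIZE // NUM_PARTITIONS)
        return self.partition_owners[partition]

ring = Ring(["node_a", "node_b", "node_c"])
owner = ring.key_to_node("some_key")  # the same key always maps to the same node
```

Because every node can compute the same ring, any node can route a request for a key to its owner without a central coordinator, which is one reason Riak has no master node.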

Also, make sure to follow the Basho Team on Twitter as we spend way too much time talking about this stuff.