Bazel is an open source build tool, similar to Make, Maven, or Gradle, that uses a language with a Python-like syntax called Starlark.
It is designed at its core as a multi-language build tool, meaning that you can use it to build projects that involve multiple programming languages.
For example, it is useful when you have a backend in Go and a frontend in React.
To build the Go backend, we will work with a tool called Gazelle, which is specific to Go.
Below are some components of the Bazel toolbox:
- Bazelisk: a wrapper for Bazel, written in Go, that automatically picks a good version of Bazel for your current working directory.
- Gazelle: a build file generator that helps developers create and maintain Bazel build files. Gazelle understands the dependencies and the structure of your Go project, and generates and updates the corresponding BUILD.bazel files.
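As a sketch of how Gazelle is typically wired up, you declare a `gazelle` target in the root BUILD.bazel file. The `gazelle:prefix` directive below uses a made-up module path; replace it with your own.

```starlark
# Root BUILD.bazel (sketch): declare a runnable Gazelle target.
load("@bazel_gazelle//:def.bzl", "gazelle")

# gazelle:prefix github.com/example/myproject
gazelle(name = "gazelle")
```

With this in place, `bazel run //:gazelle` scans your Go sources and generates or updates a BUILD.bazel file for each package.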
- A .bazelrc file: used to specify command-line options that Bazel should apply every time it runs (not necessary, but can be useful).
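For illustration, a minimal .bazelrc might look like this; the flags below are just common examples, not required for any project:

```
# .bazelrc (sketch): options applied automatically on every invocation.
build --verbose_failures       # print the full command line of failing actions
test --test_output=errors      # only show the logs of failing tests
```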
- A .bazelversion file: used to specify the version of Bazel to be used for the current project. The aforementioned Bazelisk will read this file to determine which version of Bazel to use for the build.
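As an example, a .bazelversion file simply contains a single version string (the version shown here is arbitrary):

```
6.3.2
```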
- A BUILD.bazel file: this is the core of any Bazel-based project. It is a build script telling Bazel how to build different targets (such as binaries or libraries) and what their dependencies are.
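As a hedged sketch of what such a build script can look like with rules_go (the package name, import path, and proto dependency below are all made up for illustration):

```starlark
# BUILD.bazel (sketch) for a hypothetical Go server package.
load("@io_bazel_rules_go//go:def.bzl", "go_binary", "go_library")

go_library(
    name = "server_lib",
    srcs = ["main.go"],
    importpath = "github.com/example/myproject/server",  # hypothetical module path
    deps = ["//proto:greet_go_proto"],  # hypothetical generated proto library
)

go_binary(
    name = "server",
    embed = [":server_lib"],
)
```

`bazel build //server` would then build the binary hermetically with the pinned Go toolchain.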
- A WORKSPACE.bazel file: usually an extension of the WORKSPACE file. Inside it, you define rules for fetching Go dependencies or set up a Docker image for the project.
Here is an example of such a WORKSPACE.bazel file used in the context of a gRPC architecture:
```starlark
# In WORKSPACE.bazel, we define our workspace name,
# import our version variables, and import some utilities
# to clone Git repositories and download archives.
workspace(name = "github_com_packtpublishing_grpc_go_for_professionals")

load(
    "//:versions.bzl",
    "GAZELLE_SHA256",
    "GAZELLE_VERSION",
    "GO_VERSION",
    "PROTOC_GEN_VALIDATE_VERSION",
    "PROTO_VERSION",
    "RULES_GO_SHA256",
    "RULES_GO_VERSION",
)
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")
load("@bazel_tools//tools/build_defs/repo:git.bzl", "git_repository")

# Defining the dependencies for Gazelle.
http_archive(
    name = "bazel_gazelle",
    sha256 = GAZELLE_SHA256,
    urls = [
        "https://mirror.bazel.build/github.com/bazelbuild/bazel-gazelle/releases/download/%s/bazel-gazelle-%s.tar.gz" % (GAZELLE_VERSION, GAZELLE_VERSION),
        "https://github.com/bazelbuild/bazel-gazelle/releases/download/%s/bazel-gazelle-%s.tar.gz" % (GAZELLE_VERSION, GAZELLE_VERSION),
    ],
)

# Pulling the dependencies for building Go binaries, applications, and so on.
http_archive(
    name = "io_bazel_rules_go",
    sha256 = RULES_GO_SHA256,
    urls = [
        "https://mirror.bazel.build/github.com/bazelbuild/rules_go/releases/download/%s/rules_go-%s.zip" % (RULES_GO_VERSION, RULES_GO_VERSION),
        "https://github.com/bazelbuild/rules_go/releases/download/%s/rules_go-%s.zip" % (RULES_GO_VERSION, RULES_GO_VERSION),
    ],
)

# Pulling the dependencies of rules_go, setting up the toolchain for building
# the Go project, and telling Gazelle where to find the WORKSPACE.bazel file.
load("@io_bazel_rules_go//go:deps.bzl", "go_register_toolchains", "go_rules_dependencies")
load("@bazel_gazelle//:deps.bzl", "gazelle_dependencies")
load("//:deps.bzl", "go_dependencies")

# gazelle:repository_macro deps.bzl%go_dependencies
go_dependencies()

go_rules_dependencies()

go_register_toolchains(version = GO_VERSION)

gazelle_dependencies(go_repository_default_config = "//:WORKSPACE.bazel")

# Protobuf
git_repository(
    name = "com_google_protobuf",
    remote = "https://github.com/protocolbuffers/protobuf",
    tag = PROTO_VERSION,
)

load("@com_google_protobuf//:protobuf_deps.bzl", "protobuf_deps")

protobuf_deps()

# protoc_gen_validate
git_repository(
    name = "com_envoyproxy_protoc_gen_validate",
    remote = "https://github.com/bufbuild/protoc-gen-validate",
    tag = PROTOC_GEN_VALIDATE_VERSION,
)

load("@com_envoyproxy_protoc_gen_validate//bazel:repositories.bzl", "pgv_dependencies")
load("@com_envoyproxy_protoc_gen_validate//:dependencies.bzl", "go_third_party")

pgv_dependencies()

# gazelle:repository_macro deps.bzl%go_third_party
go_third_party()
```
The Bazel philosophy
In a monorepo architecture, we can have many different subprojects in multiple languages.
As alluded to earlier, while I primarily code in Go, I could add an Angular application, some Python tools, or more.
It is then crucial that dependency management, compilation, and builds remain reproducible and simple for all the developers on the team and for the entire lifecycle of the codebase.
Bazel ensures hermeticity, meaning that builds are fully sandboxed, pinned to precise dependencies and Go versions, and guaranteed to behave the same regardless of which machine they run on. This property is a major reason why big projects use Bazel over the native Go build tooling.
In addition to hermeticity, Bazel provides:
- Advanced local caching:
  - Incremental builds: Bazel only rebuilds the parts of your project that have been modified, rather than rebuilding the entire project. This is particularly useful for large codebases where a full build could take a significant amount of time.
  - Fine-grained dependency analysis: Bazel understands the dependencies between different parts of your code. If you change a single line in a single file, Bazel will only rebuild the targets depending on that particular file, making the build faster.
  - Local caching: build artifacts are cached locally, so if you switch between Git branches or rebuild the same code, Bazel will retrieve the previously built artifacts from the cache.
- Remote caching:
  - Distributed builds: Bazel can distribute build tasks across multiple machines, which is useful in a CI environment where we might have multiple build agents.
  - Shared cache: the cache can be shared among various developers and CI servers. This means that if one developer has already built a particular target, another developer (or a CI server) can reuse the cached artifact instead of rebuilding it.
  - Cache invalidation: Bazel is smart enough to know when an artifact is out of date and needs to be rebuilt, ensuring developers are always working with the most up-to-date code.
Additional useful readings: