
Posts

Showing posts from 2021.

Best of both worlds

In one of my posts last year I mentioned that one can post automated comments to GitLab very easily with the right tooling - especially when they come from linting tools. That way every author, reviewer and maintainer gets feedback on any proposed change as fast as possible. This is super easy and very convenient when you always do a full build and every source file in your project is actually checked. So why do we need something new if that works so well...? In the bitbake world things are different: we have powerful tools like the sstate cache, among other mechanisms, precisely to avoid building everything from scratch all the time. This makes it tricky to map findings from the meta-sca layer (which fully supports sstate caching) to a pull or merge request, as we can never be sure to have the full picture.

Moving from the outside, right into it

So it was very clear that the commenting part of a CI pipeline needed to be done with the help of bitbake too... et voilà: scabot
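For illustration, posting such an automated comment boils down to a single call against the GitLab REST API (the notes endpoint is real GitLab API; the host, project id, MR number and token below are placeholders):

```python
import json
import urllib.request


def build_note_request(base_url, project_id, mr_iid, token, body):
    """Build a POST request adding a note (comment) to a merge request.

    Uses GitLab's POST /projects/:id/merge_requests/:iid/notes endpoint.
    All arguments here are placeholder values, not real credentials.
    """
    url = f"{base_url}/api/v4/projects/{project_id}/merge_requests/{mr_iid}/notes"
    data = json.dumps({"body": body}).encode("utf-8")
    headers = {"PRIVATE-TOKEN": token, "Content-Type": "application/json"}
    return urllib.request.Request(url, data=data, headers=headers, method="POST")


req = build_note_request("https://gitlab.example.com", 42, 7,
                         "dummy-token", "linter: unused variable in foo.c:12")
# urllib.request.urlopen(req)  # would actually post the comment
print(req.full_url)
```

A CI job only needs a project access token with `api` scope to run this; the tricky part the post is about is deciding *what* to comment when sstate skips most of the build.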

Size really matters

Let that title settle... and now we're getting back to a more serious issue :-).

The issue

When you're using bitbake layers in a cloud-based setup, you usually clone them on the fly - meaning a full clone of each repository, which can be highly expensive (just look at the size of the linux git repository, for instance). As cloud-based setups mostly don't offer a good way to sync those resources, unless you invent something yourself or pay for it, every bit counts - not only in terms of time but also in terms of resulting cost. The meta-sca layer I maintain has grown considerably over time, so it became very, very large - partly because I made the mistake in the past of putting large blobs (in this case tarballs) into the repository. I learned that lesson, but I cannot undo it: as we all know, every published git revision should stay untouched for all eternity. Mainly this is because of the linked-list nature of git - if I change one commit at the bottom I will alter any commit t…
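The linked-list argument can be shown in a few lines: each commit id hashes, among other things, its parent's id, so rewriting one early commit ripples through every id that follows it (a simplified model for illustration, not git's actual object format):

```python
import hashlib


def commit_id(parent_id, content):
    # Simplified model: a commit id depends on its parent's id and its content.
    return hashlib.sha1((parent_id + content).encode()).hexdigest()


def history(contents):
    """Return the chain of commit ids for a list of commit contents."""
    ids, parent = [], ""
    for content in contents:
        parent = commit_id(parent, content)
        ids.append(parent)
    return ids


original = history(["A", "B", "C"])
rewritten = history(["A'", "B", "C"])  # change only the bottom commit

# every id from the rewritten commit onward changes too
print([a == b for a, b in zip(original, rewritten)])  # → [False, False, False]
```

This is exactly why removing a blob from history means rewriting every descendant commit - and why everyone who already cloned the old history ends up with diverged ids.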

Making go not a no-go

Anyone who has dealt with container engines has come across go - a wonderful language that was built to do right what C++ intended to do. The language itself is pretty straightforward and upstream poky support has been available for ages... In the go world one would just run

    go get github.com/foo/bar
    go build github.com/foo/bar

and magically the go ecosystem would pull all the needed sources and build them into an executable. This is where the issues start... In the Openembedded world, one would have

- one provider (aka recipe) for each dependency
- each recipe comes with a (remote) artifact (e.g. tarball, git repo, a.s.o.) which can be archived (so one can build the same software at a later point in time without any online connectivity)
- dedicated license information

All this information is pretty useful when working in an environment (aka company) that has restrictions, such as

- reproducible builds
- license compliance
- security compliance (for instance no unpatched CVE)

but when us…
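To get a feeling for how many providers such a one-recipe-per-dependency mapping requires, one can simply count the unique modules listed in a project's go.sum - a hypothetical helper for illustration, not part of poky or any layer:

```python
def modules_from_gosum(text):
    """Return the unique module paths found in go.sum content.

    Each go.sum line has the shape '<module> <version>[/go.mod] <hash>';
    every unique module would need its own OpenEmbedded recipe.
    """
    modules = set()
    for line in text.splitlines():
        parts = line.split()
        if len(parts) == 3:
            modules.add(parts[0])
    return sorted(modules)


sample = """github.com/foo/bar v1.2.3 h1:abc=
github.com/foo/bar v1.2.3/go.mod h1:def=
golang.org/x/sys v0.1.0 h1:ghi=
"""
print(modules_from_gosum(sample))  # → ['github.com/foo/bar', 'golang.org/x/sys']
```

Even a modest go project easily pulls in dozens of modules this way - each one a recipe with its own SRC_URI, checksum and license information to maintain.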

Finding the culprit

Working in an environment with license restrictions is always a bit challenging. Recently, while migrating a larger code base, I encountered the following error message:

    Package gawk cannot be installed into the image because it has incompatible license(s): GPL-3.0

Hmmm, where is that actually coming from? A quick grep through my image recipes didn't reveal anything, so it has to be something pulled in by one of the packages, or even further down the line. All the package data is available in a human-readable format within your workspace, yet it's still hard to track the relations behind an issue like the one above... So I decided to write a small script which turns this into small and much easier to understand trees in the console... Et voilà, I present to you dot2tree, a small script which can turn different bitbake-related sources into tree printouts in your console. Let's give it a try with the error shown above. As we all know, bitbake/poky creates an image manifest,…
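The core idea behind such a tree printout can be sketched in a few lines: take the "a depends on b" edges (as they appear in bitbake's task-depends.dot output, for example) and recurse from the package in question. This is a simplified illustration under those assumptions, not the actual dot2tree code:

```python
from collections import defaultdict


def tree_lines(edges, root, indent=0, seen=None):
    """Render dependency edges (parent, child) as indented tree lines."""
    seen = set() if seen is None else seen
    lines = ["  " * indent + root]
    if root in seen:              # guard against dependency cycles
        return lines
    seen.add(root)
    children = defaultdict(list)
    for parent, child in edges:
        children[parent].append(child)
    for child in children[root]:
        lines += tree_lines(edges, child, indent + 1, seen)
    return lines


# hypothetical edges, as one could extract them from a dependency dot file
edges = [("my-image", "some-tool"), ("some-tool", "gawk")]
print("\n".join(tree_lines(edges, "my-image")))
```

With real data, a printout like this immediately shows which intermediate package drags the GPL-3.0 licensed gawk into the image.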

Small but valuable: automatically cleaning the clutter

If you are working with a larger stack of recipes and update them frequently, you'll inevitably reach the point where a recipe becomes obsolete. Nothing in your stack will use it any more, so it is basically just a burden (and a potential, hypothetical security risk). I had this situation with the insane number of npm packages I maintain for my meta-sca layer. Those change nearly on a daily basis, with new dependencies coming in and replacing old ones. One could read all the change logs, but let's be honest, nobody does that - except maybe for a chosen few recipes - so the question remains: how do I identify obsolete recipes? Simple: by looking up the dependencies of every recipe on every other recipe in the layer - kind of obvious, isn't it :-). Lucky me, I don't have to do that manually; we are programmers, we automate stuff - so I did. The result can be found in my meta-buildutils layer - a small script called unused. This one can be used without setting up bitbake at all, it ju…
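The idea can be sketched roughly like this: collect every recipe in the layer, walk the dependencies starting from the recipes you actually build, and whatever is never reached is a removal candidate. A toy version working on in-memory data, not the actual unused script:

```python
def find_unused(recipes, roots):
    """Return recipes not reachable from any root (e.g. image) recipe.

    recipes maps a recipe name to the set of recipes it depends on
    (think DEPENDS/RDEPENDS); roots are the entry points you build.
    """
    reachable = set()
    stack = list(roots)
    while stack:
        name = stack.pop()
        if name in reachable:
            continue
        reachable.add(name)
        stack.extend(recipes.get(name, ()))
    return sorted(set(recipes) - reachable)


# hypothetical layer content
recipes = {
    "my-image": {"npm-left-pad"},
    "npm-left-pad": set(),
    "npm-old-dep": set(),   # nothing depends on this any more
}
print(find_unused(recipes, ["my-image"]))  # → ['npm-old-dep']
```

The real script has to parse the dependency fields out of the recipe files first, but the reachability walk itself stays this simple.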