chore: run pre-commit

Signed-off-by: Matej Focko <me@mfocko.xyz>
Matej Focko 2024-01-03 19:38:35 +01:00
parent e63dbba74a
commit 2cf4a3efba
Signed by: mfocko
GPG key ID: 7C47D46246790496
36 changed files with 647 additions and 478 deletions

2
.gitignore vendored

@ -30,4 +30,4 @@ static/files/**/*.svg
static/files/**/*.png
# ignore the VSCode crap
/.vscode
/.vscode


@ -29,21 +29,22 @@ any variables, we can use just backtracking and recursion.
Let's start with some brainstorming of the solution.
* How will I know what I've checked without any variables?
* _answer_: recursion will need to take care of that, cause I'm not allowed
- How will I know what I've checked without any variables?
- _answer_: recursion will need to take care of that, cause I'm not allowed
anything else
* How will I pass around the fact I've found the exit?
* _answer_: I can return values from helper functions, so I should be able to
- How will I pass around the fact I've found the exit?
- _answer_: I can return values from helper functions, so I should be able to
indicate _found_/_not found_
* How is the exit marked?
* _answer_: there is one “beeper” as a mark
* Can I reduce my problem somehow?
* _answer_: I could check each possible direction as a reduced search space
- How is the exit marked?
- _answer_: there is one “beeper” as a mark
- Can I reduce my problem somehow?
- _answer_: I could check each possible direction as a reduced search space
## »Rough« pseudocode
We should be able to construct a _skeleton_ of our solution at least. Pseudocode
follows:
```ruby
def find_exit
if found the exit then
@ -67,6 +68,7 @@ In the proper pseudocode we will need to dive into the technical details like
the way we check for exit, move around, etc.
We can start by cleaning up and decomposing the function written above:
```ruby
def find_exit
# BASE: found exit
@ -117,11 +119,13 @@ end
```
We are missing a few of the functions that we use in our pseudocode above:
* `found_exit()`
* `turn_around()`
* `turn_right()`
- `found_exit()`
- `turn_around()`
- `turn_right()`
We can implement those easily:
```ruby
def found_exit
if not beepers_present() then
@ -154,6 +158,7 @@ Now we have everything ready for implementing it in Python.
## Actual implementation
It's just a matter of rewriting the pseudocode into Python[^1]:
```py
class SuperKarel(Karel):
# you can define your own helper functions on Karel here, if you wish to
@ -223,6 +228,7 @@ class SuperKarel(Karel):
We have relatively repetitive code for checking each of the directions, so I
would propose refactoring it a bit, in a fashion of checking just forward, so
it's more readable:
```py
def find_exit(self) -> bool:
if self.found_exit():
@ -249,6 +255,7 @@ We can also notice that turning around takes 2 left turns and turning to right
does 3. We get 5 left turns in total when we turn around and right afterwards…
Taking 4 left turns just rotates us back to our initial direction, therefore it
is sufficient to do just one left turn (`5 % 4 == 1`). That way we get:
```py
def find_exit(self) -> bool:
if self.found_exit():
@ -280,6 +287,7 @@ solution that we haven't spoken about.
We are silently expecting the maze **not to** have any loops. An example of
such a maze is the `maze666.kw`:
```
┌─────────┬─┐
│. . . . .│.│
@ -298,6 +306,7 @@ maze can be the `maze666.kw`:
If you try running our solution on this map, Karel just loops and never finds
the solution. Let's have a look at the loop he gets stuck in:
```
┌─────────┬─┐
│* * * * *│.│
@ -332,6 +341,7 @@ to mark the “cells” that we have tried. We can easily use beepers for this,
we need to be careful **not to** confuse the exit with already visited cell.
To do that we'll use **2** beepers instead of the one. Implementation follows:
```py
def visited(self) -> bool:
if not self.beepers_present():
@ -384,4 +394,4 @@ def find_exit(self) -> bool:
Now our solution works also for mazes that have loops.
[^1]: which is usually very easy matter
[^1]: which is usually very easy matter


@ -6,10 +6,10 @@ description: |
Solving the shortest path problem with a naïve approach that turns into
something.
tags:
- cpp
- brute force
- bellman ford
- dynamic programming
- cpp
- brute force
- bellman ford
- dynamic programming
last_update:
date: 2024-01-01
---
@ -33,6 +33,7 @@ highest possible price) and try to improve what we've gotten so far until there
are no improvements. That sounds fine, we shall implement this. Since we are
going on repeat, we will name this function `bf()` as in _brute-force_, cause it
is trying to find it the hard way:
```cpp
const static std::vector<vertex_t> DIRECTIONS =
std::vector{std::make_pair(0, 1), std::make_pair(0, -1),
@ -106,6 +107,7 @@ finding the path `u → x1 → … → xn → v` to subproblems
_Is our solution correct?_ It appears to be correct… We have a rather complicated
map and our algorithm has finished in an instant with the following output:
```
Normal cost: 1
Vortex cost: 5
@ -130,6 +132,7 @@ one path skipping the `*` cells, since they cost more than going around.
We can play around with it a bit. The `*` cells can even be vortices that pull
you in with a negative price and let you _propel_ yourself out :wink: Let's
change their cost to `-1` then. Let's check what's the fastest path to the cell.
```
Normal cost: 1
Vortex cost: -1
@ -175,6 +178,7 @@ the repeating patterns algorithmically.
On the other hand, we can approach this from a different perspective. Let's
assume the worst-case scenario (generalized for any graph):
> Let $K_n$ be complete graph. Let $P$ be the shortest path from $v_1$ to $v_n$
> such that $P$ has $n - 1$ edges, i.e. the shortest path between the two chosen
> vertices visits all vertices (not necessarily in order) and has the lowest
@ -194,6 +198,7 @@ assume the worst-case scenario (generalized for any graph):
_How can we leverage this?_ We will go through the edges only as many times as
we have cells. Let's adjust the code to fix the looping:
```cpp
auto bf_finite(const graph& g, const vertex_t& source,
const vertex_t& destination) -> int {
@ -242,6 +247,7 @@ auto bf_finite(const graph& g, const vertex_t& source,
```
And we get the following result:
```
Normal cost: 1
Vortex cost: -1
@ -291,6 +297,7 @@ exception that it doesn't report whether there are any negative cycles, it just
ends.
Let's have a look at a proper implementation of the Bellman-Ford algorithm:
```cpp
auto bellman_ford(const graph& g, const vertex_t& source)
-> std::vector<std::vector<int>> {
@ -364,6 +371,7 @@ auto bellman_ford(const graph& g, const vertex_t& source)
```
And if we run it with our negative cost of entering vortices:
```
[Bellman-Ford] Found a negative loop
[Bellman-Ford] Cost: -240
@ -399,12 +407,14 @@ Let's have a short look at the time complexities of the presented algorithms:
1. naïve approach: given that there are no negative loops, we are bound by the
worst-case ordering of the relaxations which results in
$$
\mathcal{O}(\vert V \vert \cdot \vert E \vert)
$$
2. our naïve approach with the fixed count of iterations instead of the
`do-while` loop results in the same worst-case time complexity:
$$
\Theta(\vert V \vert \cdot \vert E \vert)
$$
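To make this concrete for the grid map used throughout these posts (each cell has at most four neighbours, one per entry of `DIRECTIONS`, so $\vert E \vert \leq 4 \cdot \vert V \vert$), the bound specializes to:

$$
\mathcal{O}(\vert V \vert \cdot \vert E \vert) = \mathcal{O}(\vert V \vert^2)
$$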
@ -418,6 +428,7 @@ Let's have a short look at the time complexities of the presented algorithms:
Since we are literally copy-pasting the body of the loops just for the sake of
relaxing, we can factor that part out into a separate function:
```cpp
static auto _check_vertex(const graph& g,
std::vector<std::vector<int>>& distances, int v,
@ -461,6 +472,7 @@ there would be any edge relaxed instead of performing the relaxation itself.
Then we can also see the differences between the specific versions of our
path-finding algorithms in a clear way:
```cpp
auto bf(const graph& g, const vertex_t& source, const vertex_t& destination)
-> int {
@ -566,8 +578,9 @@ consider it a brute-force algorithm.
:::
[^1]: [Breadth-first search](https://en.wikipedia.org/wiki/Breadth-first_search)
[^2]: Of course, there are some technicalities like keeping track of the visited
vertices to not taint the shortest path by already visited vertices.
[^3]: or at least you should, LOL
[^2]:
Of course, there are some technicalities like keeping track of the visited
vertices to not taint the shortest path by already visited vertices.
[^3]: or at least you should, LOL


@ -5,10 +5,10 @@ title: Dijkstra's algorithm
description: |
Moving from Bellman-Ford into Dijkstra's algorithm.
tags:
- cpp
- dynamic programming
- greedy
- dijkstra
- cpp
- dynamic programming
- greedy
- dijkstra
last_update:
date: 2024-01-03
---
@ -46,6 +46,7 @@ I'll start with a well-known meme about Dijkstra's algorithm:
![Dijkstra's algorithm meme](/img/algorithms/paths/bf-to-astar/dijkstra-meme.jpg)
And then follow up on that with the actual backstory from Dijkstra himself:
> What is the shortest way to travel from Rotterdam to Groningen, in general:
> from given city to given city. It is the algorithm for the shortest path,
> which I designed in about twenty minutes. One morning I was shopping in
@ -75,6 +76,7 @@ monotonically non-decreasing changes in the costs of shortest paths.
## Short description
Let's have a brief look at the pseudocode taken from the Wikipedia:
```
function Dijkstra(Graph, source):
for each vertex v in Graph.Vertices:
@ -141,6 +143,7 @@ Firstly we need to have some priority queue wrappers. C++ itself offers
functions that can be used for maintaining max heaps. They also have a
generalized version that accepts any ordering; in our case we need the reverse
ordering, because we need a min heap.
```cpp
using pqueue_item_t = std::pair<int, vertex_t>;
using pqueue_t = std::vector<pqueue_item_t>;
@ -165,6 +168,7 @@ auto popq(pqueue_t& q) -> std::optional<pqueue_item_t> {
And now we can finally move to the actual implementation of the Dijkstra's
algorithm:
```cpp
auto dijkstra(const graph& g, const vertex_t& source)
-> std::vector<std::vector<int>> {
@ -220,6 +224,7 @@ structure.
The original implementation doesn't leverage the heap, which results in
repetitive _look-ups_ of the “closest” vertex, hence we get the following
worst-case time complexity in the _Bachmann-Landau_ notation:
$$
\Theta(\vert V \vert^2)
$$
@ -227,6 +232,7 @@ $$
If we turn our attention to the backing data structure, we always want the
“cheapest” vertex; that's why we can use a min heap. Given that we use a
Fibonacci heap, we can achieve the following amortized time complexity:
$$
\mathcal{O}(\vert E \vert + \vert V \vert \cdot \log{\vert V \vert})
$$
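For completeness: with a plain binary heap, which is effectively what the `std` heap helpers shown earlier maintain, the commonly cited bound is:

$$
\mathcal{O}((\vert V \vert + \vert E \vert) \cdot \log{\vert V \vert})
$$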
@ -242,6 +248,7 @@ min or max).
## Running the Dijkstra
Let's run our code:
```
Normal cost: 1
Vortex cost: 5
@ -273,6 +280,7 @@ infinitely when you have negative weights or loops in the graph. Well, if we use
our _propelling vortices_, not only do we have the negative weights, but also the
negative loops. Let's run our code! Our first naïve approach was actually
looping:
```
Normal cost: 1
Vortex cost: -1


@ -5,9 +5,9 @@ title: A* algorithm
description: |
Moving from Dijkstra's algorithm into the A* algorithm.
tags:
- cpp
- dynamic programming
- astar
- cpp
- dynamic programming
- astar
last_update:
date: 2024-01-03
---
@ -15,6 +15,7 @@ last_update:
## Intro
Let's start with a recap of what we've achieved so far:
1. We have implemented a naïve brute-force algorithm that tries to relax paths
as long as there are any paths to be relaxed.
2. Then we have fixed an issue caused by negative loops that can result in
@ -43,9 +44,9 @@ The important question here is how to _influence_ the algorithm, so that it does
choose the path that _makes more sense_ rather than the one that costs the
least.
## A* description
## A\* description
The _A* algorithm_ can be considered a modification of Dijkstra's algorithm. The
The _A\* algorithm_ can be considered a modification of Dijkstra's algorithm. The
cost is still the same, we cannot change it, right? However when we pick the
vertices from the heap, we can influence the order by some _heuristic_. In this
case, we introduce a function that can suggest how feasible the vertex is.
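Stated as a formula (the standard way this is usually written; the post itself keeps it in prose), the priority used when popping from the heap becomes

$$
f(v) = g(v) + h(v)
$$

where $g(v)$ is the cost of the path found so far and $h(v)$ is the heuristic estimate of the remaining cost.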
@ -66,6 +67,7 @@ road makes us 50 km away and using the other road we will be 200 km away.
Our map is a bit simpler, but we can use a very similar principle. We will use
the _Manhattan distance_, which is defined in the following way:
$$
\vert x_a - x_b \vert + \vert y_a - y_b \vert
$$
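For two hypothetical points $(2, 3)$ and $(5, 1)$ this gives $\vert 2 - 5 \vert + \vert 3 - 1 \vert = 5$, i.e. the number of grid steps needed if there were no walls in the way.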
@ -81,6 +83,7 @@ calculate the shortest path and pass the heuristic as a parameter.
## Implementation
The actual implementation is very easy once we have Dijkstra's algorithm:
```cpp
auto astar(const graph& g, const vertex_t& source, const auto& h)
-> std::vector<std::vector<int>> {
@ -132,6 +135,7 @@ auto astar(const graph& g, const vertex_t& source, const auto& h)
## Running on our map
For this algorithm I will also show the example of a call:
```cpp
distances = astar(g, std::make_pair(1, 9), [](const auto& u) {
auto [x, y] = u;
@ -145,6 +149,7 @@ source vertex where we start. And finally the lambda returns
the _Manhattan distance_ to the goal.
And we get the following result:
```
Normal cost: 1
Vortex cost: 5
@ -171,6 +176,7 @@ Graph:
Now you may wonder how it compares to the previous algorithms. Supposedly it
should be faster. Let's add counters and debugging output when we update
distance to our goal. And now if we run our code, we get the following output:
```
Normal cost: 1
Vortex cost: 5


@ -5,13 +5,13 @@ title: From BF to A*
description: |
Figuring out the shortest-path problem from the BF to the A* algorithm.
tags:
- cpp
- brute force
- bellman ford
- dynamic programming
- dijkstra
- greedy
- a star
- cpp
- brute force
- bellman ford
- dynamic programming
- dijkstra
- greedy
- a star
last_update:
date: 2024-01-01
---
@ -24,6 +24,7 @@ algorithms, we will use a 2D map with some rules that will allow us to show cons
and pros of the shown algorithms.
Let's have a look at the example map:
```
#############
#..#..*.*.**#
@ -39,22 +40,26 @@ Let's have a look at the example map:
```
We can see three different kinds of cells:
1. `#` which represent walls, that cannot be entered at all
2. `*` which represent vortices that can be entered at the cost of 5 coins
3. `.` which represent normal cells that can be entered for 1 coin (which is the
base price of moving around the map)
Let's dissect a specific position on the map to get a better grasp of the rules:
```
.
#S*
.
```
We are standing in the cell marked with `S` and we have the following options
* move to the north (`.`) with the cost of 1 coin,
* move to the west (`#`) **is not** allowed because of the wall,
* move to the east (`*`) is allowed with the cost of 5 coins, and finally
* move to the south (`.`) with the cost of 1 coin.
- move to the north (`.`) with the cost of 1 coin,
- move to the west (`#`) **is not** allowed because of the wall,
- move to the east (`*`) is allowed with the cost of 5 coins, and finally
- move to the south (`.`) with the cost of 1 coin.
:::info
@ -67,13 +72,15 @@ Further on I will follow the same scheme for marking cells with an addition of
For working with this map I have prepared a basic structure for the graph in C++
that will abstract some of the internal workings of our map, namely:
* remembers the costs of moving around
* provides a simple function that returns price for moving **directly** between
- remembers the costs of moving around
- provides a simple function that returns price for moving **directly** between
two positions on the map
* allows us to print the map out, just in case we'd need some adjustments to be
- allows us to print the map out, just in case we'd need some adjustments to be
made
We can see the `graph` header here:
```cpp
#ifndef _GRAPH_HPP
#define _GRAPH_HPP


@ -3,14 +3,14 @@ title: How can Copr help with broken dependencies
description: Copr comes to save you when maintainer doesn't care.
date: 2023-08-02
authors:
- key: mf
title: a.k.a. your opinionated admin
- key: mf
title: a.k.a. your opinionated admin
tags:
- 🏭
- red-hat
- copr
- admin
- vps
- 🏭
- red-hat
- copr
- admin
- vps
---
When you decide to run Fedora on your VPS, you might get screwed over by using
@ -37,10 +37,12 @@ If you have ever used Ubuntu, you might be familiar with the concept since it is
very close to [PPAs](https://help.ubuntu.com/community/PPA).
tl;dr of the whole process consists of
1. enabling the Copr repository, and
2. installing the desired package.
So in shell you would do
```
# dnf copr enable copr-repository
# dnf install package-from-the-repository
@ -110,6 +112,7 @@ Copr is heavily used for testing builds on the upstream with
I have described above), if need be.
[^1]: [vpsFree.cz](https://vpsfree.cz)
[^2]: Even though I've been running archLinux on some Raspberry Pi's and also
on one of my “home servers”, before getting the VPS. You could say I like
to live on the edge…
[^2]:
Even though I've been running archLinux on some Raspberry Pi's and also
on one of my “home servers”, before getting the VPS. You could say I like
to live on the edge…


@ -5,9 +5,9 @@ date: 2022-12-14T21:45
slug: aoc-2022/intro
authors: mf
tags:
- advent-of-code
- advent-of-code-2022
- rust
- advent-of-code
- advent-of-code-2022
- rust
hide_table_of_contents: false
---
@ -58,6 +58,7 @@ Since we are using Rust, we are going to use a [Cargo] and more than likely VSCo
with [`rust-analyzer`]. Because of my choice of libraries, we will also introduce
a `.envrc` file that can be used by [`direnv`], which allows you to set specific
environment variables when you enter a directory. In our case, we will use
```bash
# to show nice backtrace when using the color-eyre
export RUST_BACKTRACE=1
@ -69,6 +70,7 @@ export RUST_LOG=trace
And for one of the most obnoxious things ever, we will use a script to download
the inputs instead of “_clicking, opening and copying to a file_”[^1]. There is
no need to be _fancy_, so we will adjust a Python script by Martin[^2].
```py
#!/usr/bin/env python3
@ -116,6 +118,7 @@ if __name__ == "__main__":
If the script is called without any arguments, it will deduce the day from the
system, so we do not need to change the day every morning. It also requires a
configuration file:
```yaml
# env.yaml
session: your session cookie
@ -159,6 +162,7 @@ parsing and one for 2D vector (that gets used quite often during Advent of Code)
Key part is, of course, processing the input and my library exports following
functions that get used a lot:
```rust
/// Reads file to the string.
pub fn file_to_string<P: AsRef<Path>>(pathname: P) -> String;
@ -195,6 +199,7 @@ need.
We can also prepare a template to quickly bootstrap each of the days. We know
that each puzzle has 2 parts, which means that we can start with 2 functions that
will solve them.
```rust
fn part1(input: &Input) -> Output {
todo!()
@ -210,6 +215,7 @@ of puzzles, it is the same type). `todo!()` can be used as a nice placeholder,
it also causes a panic when reached and we could also provide some string with
an explanation, e.g. `todo!("part 1")`. We have not given functions a specific
type and to avoid as much copy-paste as possible, we will introduce type aliases.
```rust
type Input = String;
type Output = i32;
@ -226,6 +232,7 @@ For each day we get a personalized input that is provided as a text file. Almost
all the time, we would like to get some structured type out of that input, and
therefore it makes sense to introduce a new function that will provide the parsing
of the input.
```rust
fn parse_input(path: &str) -> Input {
todo!()
@ -237,6 +244,7 @@ sample instead of input.
OK, so now we can write a `main` function that will take all of the pieces and
run them.
```rust
fn main() {
let input = parse_input("inputs/dayXX.txt");
@ -249,6 +257,7 @@ fn main() {
This would definitely do :) But we have installed a few libraries and we want to
use them. In this part we are going to utilize _[`tracing`]_ (for tracing, duh…)
and _[`color-eyre`]_ (for better error reporting, e.g. from parsing).
```rust
fn main() -> Result<()> {
tracing_subscriber::fmt()
@ -274,11 +283,13 @@ The first statement will set up tracing and configure it to print out the logs t
terminal, based on the environment variable. We also change the formatting a bit,
since we do not need all the _fancy_ features of the logger. Pure initialization
would get us logs like this:
```
2022-12-11T19:53:19.975343Z INFO day01: Part 1: 0
```
However after running that command, we will get the following:
```
INFO src/bin/day01.rs:35: Part 1: 0
```
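The builder calls that produce this format are cut off by the diff hunk above, so, as an assumption, a configuration along these lines yields that second shape of output:

```rust
use tracing::info;
use tracing_subscriber::EnvFilter;

fn main() {
    // Assumed setup (not the post's exact code): respect RUST_LOG via the
    // `env-filter` feature, drop timestamps and targets, show file + line.
    tracing_subscriber::fmt()
        .with_env_filter(EnvFilter::from_default_env())
        .without_time()
        .with_target(false)
        .with_file(true)
        .with_line_number(true)
        .init();

    info!("Part 1: {}", 0);
}
```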
@ -296,6 +307,7 @@ at the end of the `::install` which _unwraps_ the **»result«** of the installa
:::
Overall we will get to a template like this:
```rust
use aoc_2022::*;
@ -338,15 +350,18 @@ fn main() -> Result<()> {
}
```
[^1]: Copy-pasting might be a relaxing thing to do, but you can also discover
nasty stuff about your PC. See [this Reddit post and the comment].
[^2]: [GitHub profile](https://github.com/martinjonas)
[^3]: Even though you can use it even for libraries, but handling errors from
libraries using `anyhow` is nasty… You will be the stinky one ;)
[^1]:
Copy-pasting might be a relaxing thing to do, but you can also discover
nasty stuff about your PC. See [this Reddit post and the comment].
[_Advent of Code_]: https://adventofcode.com
[GitLab]: https://gitlab.com/mfocko/advent-of-code-2022
[Cargo]: https://doc.rust-lang.org/cargo/
[^2]: [GitHub profile](https://github.com/martinjonas)
[^3]:
Even though you can use it even for libraries, but handling errors from
libraries using `anyhow` is nasty… You will be the stinky one ;)
[_advent of code_]: https://adventofcode.com
[gitlab]: https://gitlab.com/mfocko/advent-of-code-2022
[cargo]: https://doc.rust-lang.org/cargo/
[`rust-analyzer`]: https://rust-analyzer.github.io/
[`direnv`]: https://direnv.net/
[`tracing`]: https://crates.io/crates/tracing
@ -357,4 +372,4 @@ libraries using `anyhow` is nasty… You will be the stinky one ;)
[`regex`]: https://crates.io/crates/regex
[`lazy_static`]: https://crates.io/crates/lazy_static
[`itertools`]: https://crates.io/crates/itertools
[this Reddit post and the comment]: https://www.reddit.com/r/adventofcode/comments/zb98pn/comment/iyq0ono
[this reddit post and the comment]: https://www.reddit.com/r/adventofcode/comments/zb98pn/comment/iyq0ono


@ -5,9 +5,9 @@ date: 2022-12-15T01:15
slug: aoc-2022/1st-week
authors: mf
tags:
- advent-of-code
- advent-of-code-2022
- rust
- advent-of-code
- advent-of-code-2022
- rust
hide_table_of_contents: false
---
@ -43,6 +43,7 @@ handle samples. With each puzzle you usually get a sample input and expected
output. You can use them to verify that your solution works, or usually doesn't.
At first I've decided to put asserts into my `main`, something like
```rust
assert_eq!(part_1(&sample), 24000);
info!("Part 1: {}", part_1(&input));
@ -53,6 +54,7 @@ info!("Part 2: {}", part_2(&input));
However, once you get further, the sample input may take some time to run itself.
So in the end, I have decided to turn them into unit tests:
```rust
#[cfg(test)]
mod tests {
@ -124,6 +126,7 @@ Fighting the compiler took me 30 minutes.
We need to find a common item among 2 collections, that's an easy task, right?
We can construct 2 sets and find an intersection:
```rust
let top: HashSet<i32> = [1, 2, 3].iter().collect();
let bottom: HashSet<i32> = [3, 4, 5].iter().collect();
@ -133,6 +136,7 @@ Now, the first issue that we encounter is caused by the fact that we are using
a slice (the `[…]`), whose iterator returns **references** to the numbers.
And we get immediately yelled at by the compiler, because the numbers are discarded
after running the `.collect`. To fix this, we can use `.into_iter`:
```rust
let top: HashSet<i32> = [1, 2, 3].into_iter().collect();
let bottom: HashSet<i32> = [3, 4, 5].into_iter().collect();
@ -140,9 +144,11 @@ let bottom: HashSet<i32> = [3, 4, 5].into_iter().collect();
This way the numbers will get copied instead of referenced. OK, let's find the
intersection of those 2 collections:
```rust
println!("Common elements: {:?}", top.intersection(&bottom));
```
```
Common elements: [3]
```
@ -161,6 +167,7 @@ that should be fairly easy, we have an intersection and we want to find intersec
over all of them.
Let's have a look at the type of the `.intersection`
```rust
pub fn intersection<'a>(
    &'a self,
@ -169,11 +176,13 @@ pub fn intersection<'a>(
```
OK… Huh… But we have an example there!
```rust
let intersection: HashSet<_> = a.intersection(&b).collect();
```
Cool, that's all we need.
```rust
let top: HashSet<i32> = [1, 2, 3, 4].into_iter().collect();
let bottom: HashSet<i32> = [3, 4, 5, 6].into_iter().collect();
@ -183,11 +192,13 @@ let bottom_2: HashSet<i32> = [4, 5, 6].into_iter().collect();
let intersection: HashSet<_> = top.intersection(&bottom).collect();
println!("Intersection: {:?}", intersection);
```
```
Intersection: {3, 4}
```
Cool, so let's do the intersection with the `top_2`:
```rust
let top: HashSet<i32> = [1, 2, 3, 4].into_iter().collect();
let bottom: HashSet<i32> = [3, 4, 5, 6].into_iter().collect();
@ -200,6 +211,7 @@ println!("Intersection: {:?}", intersection);
```
And we get yelled at by the compiler:
```
error[E0308]: mismatched types
--> src/main.rs:10:58
@ -228,6 +240,7 @@ making sure you're not doing something naughty that may cause an **undefined**
:::
To resolve this we need to get an iterator that **clones** the elements:
```rust
let top: HashSet<i32> = [1, 2, 3, 4].into_iter().collect();
let bottom: HashSet<i32> = [3, 4, 5, 6].into_iter().collect();
@ -239,6 +252,7 @@ let intersection: HashSet<_> = intersection.intersection(&top_2).cloned().collec
let intersection: HashSet<_> = intersection.intersection(&bottom_2).cloned().collect();
println!("Intersection: {:?}", intersection);
```
```
Intersection: {4}
```
@ -273,11 +287,12 @@ Let's play with stacks of crates.
:::
Very easy problem with very annoying input. You can judge for yourself:
```
[D]
[N] [C]
[D]
[N] [C]
[Z] [M] [P]
1 2 3
1 2 3
move 1 from 2 to 1
move 3 from 1 to 3
@ -287,7 +302,6 @@ move 1 from 1 to 2
Good luck transforming that into something reasonable :)
:::tip Fun fact
Took me 40 minutes to parse this reasonably, including fighting the compiler.
@ -300,6 +314,7 @@ For the initial solution I went with a manual solution (as in _I have done all_
_the work_). Later on I have decided to explore the `std` and the interface of the
`std::vec::Vec` and found [`split_off`] which takes an index and splits (duh)
the vector:
```rust
let mut vec = vec![1, 2, 3];
let vec2 = vec.split_off(1);
@ -343,11 +358,12 @@ directories that take a lot of space and should be deleted.
:::
> I was waiting for this moment, and yet it got me!
> *imagine me swearing for hours*
> _imagine me swearing for hours_
### Solution
We need to “_build_” a file system from the input that is given in the following form:
```
$ cd /
$ ls
@ -417,6 +433,7 @@ to have `Rc<RefCell<T>>`.
So, how are we going to represent the file system then? We will use an enumeration,
hehe, which is an algebraic data type that can store some stuff in itself :weary:
```rust
type FileHandle = Rc<RefCell<AocFile>>;
@ -433,6 +450,7 @@ out the value of that enumeration, it's derived, so it's not as good as if we ha
implemented it ourselves, but it's good enough for debugging, hence the name.
Now to the fun part! An `AocFile` value can be represented in two ways:
- `File(usize)`, e.g. `AocFile::File(123)` and we can pattern match it, if we
need to
- `Directory(BTreeMap<String, FileHandle>)` will represent the directory and will
@ -491,13 +509,13 @@ in Rust, we can say that
You can easily see that they only differ in the mutability. (And even that is not
as simple as it seems, because there is also `Cell<T>`)
[_Advent of Code_]: https://adventofcode.com
[GitLab]: https://gitlab.com/mfocko/advent-of-code-2022
[_advent of code_]: https://adventofcode.com
[gitlab]: https://gitlab.com/mfocko/advent-of-code-2022
[`/src/bin/`]: https://gitlab.com/mfocko/advent-of-code-2022/-/tree/main/src/bin
[`sccache`]: https://github.com/mozilla/sccache
[`RangeInclusive`]: https://doc.rust-lang.org/std/ops/struct.RangeInclusive.html
[`rangeinclusive`]: https://doc.rust-lang.org/std/ops/struct.RangeInclusive.html
[`split_off`]: https://doc.rust-lang.org/std/vec/struct.Vec.html#method.split_off
[`du`]: https://www.man7.org/linux/man-pages/man1/du.1.html
[`HashMap`]: https://doc.rust-lang.org/std/collections/struct.HashMap.html
[`BTreeMap`]: https://doc.rust-lang.org/std/collections/struct.BTreeMap.html
[`hashmap`]: https://doc.rust-lang.org/std/collections/struct.HashMap.html
[`btreemap`]: https://doc.rust-lang.org/std/collections/struct.BTreeMap.html
[_tree catamorphism_]: https://en.wikipedia.org/wiki/Catamorphism#Tree_fold


@ -5,9 +5,9 @@ date: 2022-12-25T23:15
slug: aoc-2022/2nd-week
authors: mf
tags:
- advent-of-code
- advent-of-code-2022
- rust
- advent-of-code
- advent-of-code-2022
- rust
hide_table_of_contents: false
---
@ -109,6 +109,7 @@ break in multiple places at once. I'll get back to it…
:::
Let's split it in multiple parts:
- `v: &'a [Vec<U>]` represents the 2D `Vec` we are indexing, `Vec` implements
the `Slice` trait and _clippy_ recommends using `&[T]` over `&Vec<T>`, exact
details are unknown to me
@ -128,6 +129,7 @@ Let's split it in multiple parts:
The first issue with our implementation is the fact that we cannot get a mutable
reference out of that function. This could be easily resolved by introducing a new
function, e.g. `index_mut`. Which I have actually done while writing this part:
```rust
pub fn index_mut<'a, T, U>(v: &'a mut [Vec<U>], idx: &Vector2D<T>) -> &'a mut U
where
@ -153,6 +155,7 @@ types for indexing “built-in” types.
Another part of this rabbit hole is trait `SliceIndex<T>` that is of a relevance
because of
```rust
impl<T, I> Index<I> for [T]
where
@ -177,6 +180,7 @@ and is marked as `unsafe`.
Another problem is a requirement for indexing either `[Vec<T>]` or `Vec<Vec<T>>`.
This requirement could be countered by removing the inner type `Vec<T>` and
constraining it by the trait `Index` (or `IndexMut` respectively) in the
following way
```rust
pub fn index<'a, C, T>(v: &'a [C], idx: &Vector2D<T>) -> &'a C::Output
where
@ -203,6 +207,7 @@ that you can use to your advantage; you can easily guess how).
So how can we approach this then? Well… we will convert the bounds instead of
the indices, and that leads us to:
```rust
pub fn in_range<T, U>(v: &[Vec<U>], idx: &Vector2D<T>) -> bool
where
@ -225,6 +230,7 @@ You can tell that it's definitely a shitty code. Let's improve it now! We will
get back to the original idea, but do it better. We know that we cannot convert
negative values into `usize`, **but** we also know that conversion like that
returns a `Result<T, E>` which we can use to our advantage.
```rust
pub fn in_range<T, U>(v: &[Vec<U>], idx: &Vector2D<T>) -> bool
where
@ -247,6 +253,7 @@ method returns `Result<T, E>`.
We call `and_then` on that _result_, let's have a look at the type signature of
`and_then`, IMO it explains more than enough:
```rust
pub fn and_then<U, F>(self, op: F) -> Result<U, E>
where
@ -297,6 +304,7 @@ generates a lot of boilerplate. And even though it can be easily copied, it's
just a waste of space and unnecessary code. Let's “simplify” this (on one end
while creating monster on the other end). I've gone through what we need in the
preparations for the AoC. Let's sum up our requirements:
- parsing
- part 1 & 2
- running on sample / input
@ -307,6 +315,7 @@ cannot do anything about it. However running and testing can be simplified!
Let's introduce and export a new module `solution` that will take care of all of
this. We will start by introducing a trait for each day.
```rust
pub trait Solution<Input, Output: Display> {
fn parse_input<P: AsRef<Path>>(pathname: P) -> Input;
@ -322,6 +331,7 @@ implement the `Solution` trait.
Now we need to get rid of the boilerplate, we can't get rid of the `main` function,
but we can at least move out the functionality.
```rust
fn run(type_of_input: &str) -> Result<()>
where
@ -382,6 +392,7 @@ advised to use it any production code.
And now we can get to the nastiest stuff :weary: We will **generate** the tests!
We want to be able to generate tests for the sample input in the following way:
```rust
test_sample!(day_01, Day01, 42, 69);
```
@ -420,6 +431,7 @@ parameters have their name prefixed with `$` sign and you can define various “
of your macro. Let's go through it!
We have following parameters:
- `$mod_name` which represents the name for the module with tests, it is typed
with `ident` which means that we want a valid identifier to be passed in.
- `$day_struct` represents the structure that will be used for tests, it is typed
@ -429,6 +441,7 @@ We have following parameters:
Apart from that we need to use `#[macro_export]` to mark the macro as exported
for usage outside of the module. Now our skeleton looks like:
```rust
use aoc_2022::*;
@ -476,6 +489,7 @@ And the issue is caused by different types of `Output` for the part 1 and part 2
The problem is relatively simple and consists of simulating a CPU; I have
approached it in the following way:
```rust
fn evaluate_instructions(instructions: &[Instruction], mut out: Output) -> Output {
instructions
@ -500,6 +514,7 @@ have an `enumeration` that can _bear_ some other values apart from the type itse
We could've seen something like this with the `Result<T, E>` type that can be
defined as
```rust
enum Result<T, E> {
Ok(T),
@ -512,6 +527,7 @@ enum Result<T, E> {
When we have an `Ok` value, it has the result itself, and when we get an `Err`
value, it has the error. This also allows us to handle _results_ in a rather
pretty way:
```rust
match do_something(x) {
Ok(y) => {
@ -526,6 +542,7 @@ match do_something(x) {
:::
My solution has the following outline:
```rust
fn execute(&self, i: &Instruction, output: &mut Output) -> State {
// execute the instruction
@ -586,6 +603,7 @@ also rolling down the hill…
As I have said in the _tl;dr_, we are looking for the shortest path, but the start
and goal differ for parts 1 and 2. So I have decided to refactor my solution
to a BFS algorithm that takes necessary parameters via functions:
```rust
fn bfs<F, G>(
graph: &[Vec<char>], start: &Position, has_edge: F, is_target: G
@ -621,6 +639,7 @@ Processing packets with structured data from the distress signal.
You can implement a lot of traits if you want to. It is _imperative_ to implement
ordering on the packets. I had a typo, so I also proceeded to implement a `Display`
trait for debugging purposes:
```rust
impl Display for Packet {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
@ -765,13 +784,14 @@ Why? We have it implemented for the slices (`[C]`), why doesn't it work? Well,
the fun part consists of the fact that in other place, where we were using it,
we were passing the `&[Vec<T>]`, but this is coming from a helper functions that
take `&Vec<Vec<T>>` instead. And… we don't implement `Index` and `IndexMut` for
those. Just for the slices. :exploding_head: *What are we going to do about it?*
those. Just for the slices. 🤯 _What are we going to do about it?_
We can either start copy-pasting or be smarter about it… I choose to be smarter,
so let's implement a macro! The only difference across the implementations is
the type of the outer container. The implementation doesn't differ **at all**!
Implementing the macro can be done in the following way:
```rust
macro_rules! generate_indices {
($container:ty) => {
@ -807,6 +827,7 @@ macro_rules! generate_indices {
```
And now we can simply do
```rust
generate_indices!(VecDeque<C>);
generate_indices!([C]);
@ -830,6 +851,7 @@ copy-paste, cause the cost of this “monstrosity” outweighs the benefits of n
This issue is relatively funny. If you don't use any type aliases, just the raw
types, _clippy_ will suggest certain changes. For example, if you
consider the following piece of code
```rust
fn get_sum(nums: &Vec<i32>) -> i32 {
nums.iter().sum()
@ -842,6 +864,7 @@ fn main() {
```
and you run _clippy_ on it, you will get
```
Checking playground v0.0.1 (/playground)
warning: writing `&Vec` instead of `&[_]` involves a new object where a slice will do
@ -858,6 +881,7 @@ warning: `playground` (bin "playground") generated 1 warning
```
However, if you introduce a type alias, such as
```rust
type Numbers = Vec<i32>;
```
@ -865,5 +889,5 @@ type Numbers = Vec<i32>;
Then _clippy_ won't say anything, cause there is literally nothing to suggest.
However the outcome is not the same…
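To make the contrast concrete, here is the aliased variant of the earlier `get_sum` (my own snippet, mirroring the post's example): clippy stays silent, yet the parameter is still effectively a `&Vec<i32>`.

```rust
type Numbers = Vec<i32>;

// No lint fires here, even though `&Numbers` is exactly the `&Vec<i32>`
// that clippy complained about before the alias was introduced.
fn get_sum(nums: &Numbers) -> i32 {
    nums.iter().sum()
}

fn main() {
    let numbers: Numbers = vec![1, 2, 3];
    println!("Sum: {}", get_sum(&numbers));
}
```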
[_Advent of Code_]: https://adventofcode.com
[BFS above]: #day-12-hill-climbing-algorithm
[_advent of code_]: https://adventofcode.com
[bfs above]: #day-12-hill-climbing-algorithm


@ -5,9 +5,9 @@ date: 2023-07-06T21:00
slug: aoc-2022/3rd-week
authors: mf
tags:
- advent-of-code
- advent-of-code-2022
- rust
- advent-of-code
- advent-of-code-2022
- rust
hide_table_of_contents: false
---
@ -51,6 +51,7 @@ underlying data structure.
Here you can see a rather short snippet from the solution that allows you to
“index” the graph:
```rust
impl Index<&str> for Graph {
type Output = Vertex;
@ -78,6 +79,7 @@ different ways the work can be split.
Being affected by _functional programming brain damage_:tm:, I have chosen to
do this part with a function that returns an iterator over the possible ways:
```rust
fn pairings(
valves: &BTreeSet<String>,
@ -123,6 +125,7 @@ iterate through the positions that can actually collide with the wall or other
piece.
To get the desired behaviour, you can just compose a few smaller functions:
```rust
fn occupied(shape: &[Vec<char>]) -> impl Iterator<Item = Position> + '_ {
shape.iter().enumerate().flat_map(|(y, row)| {
@ -140,12 +143,12 @@ fn occupied(shape: &[Vec<char>]) -> impl Iterator<Item = Position> + '_ {
In the end, we get relative positions which we can adjust later when given the
specific positions from iterator. You can see some interesting parts in this:
* `.enumerate()` allows us to get both the indices (coordinates) and the line
- `.enumerate()` allows us to get both the indices (coordinates) and the line
or, later on, the character itself,
* `.flat_map()` flattens the iterator, i.e. when we return another iterator,
- `.flat_map()` flattens the iterator, i.e. when we return another iterator,
they just get chained instead of iterating over iterators (which sounds pretty
disturbing, doesn't it?),
* and finally `.filter_map()` which is pretty similar to the “basic” `.map()`
- and finally `.filter_map()` which is pretty similar to the “basic” `.map()`
with one key difference: it expects the items of an iterator to be
mapped to an `Option<T>` from which it ignores nothing (as in `None` :wink:)
and also unwraps the values from `Some(…)`.
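As a tiny, self-contained illustration of that exact chain (a toy example of mine, not the solution's `occupied`), collecting the coordinates of all `#` cells:

```rust
fn main() {
    let rows = vec!["#.".to_string(), ".#".to_string()];

    let occupied: Vec<(usize, usize)> = rows
        .iter()
        .enumerate() // (y, row)
        .flat_map(|(y, row)| {
            row.chars()
                .enumerate() // (x, character)
                .filter_map(move |(x, c)| if c == '#' { Some((x, y)) } else { None })
        })
        .collect();

    assert_eq!(occupied, vec![(0, 0), (1, 1)]);
}
```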
@ -156,6 +159,7 @@ In the solution we cycle through both Tetris-like shapes that fall down and the
jets that move our pieces around. Initially I have implemented my own infinite
iterator that just yields the indices. It is a very simple, yet powerful, piece
of code:
```rust
struct InfiniteIndex {
size: usize,
@ -188,7 +192,7 @@ right away in the constructor of my structure and the iterators would borrow
both the jets and shapes which would introduce a lifetime dependency into the
structure.
## [Day 18: Boiling Boulders](https://adventofcode.com/2022/day/18)
## [Day 18: Boiling Boulders](https://adventofcode.com/2022/day/18)
:::info tl;dr
@ -253,6 +257,7 @@ a rather interesting issue with `.borrow_mut()` method being used on `Rc<RefCell
#### `.borrow_mut()`
Consider the following snippet of the code (taken from the documentation):
```rust
use std::cell::{RefCell, RefMut};
use std::collections::HashMap;
@ -289,6 +294,7 @@ It is a very primitive example for `Rc<RefCell<T>>` and mutable borrow.
If you uncomment the 4th line with `use std::borrow::BorrowMut;`, you cannot
compile the code anymore, because of
```
Compiling playground v0.0.1 (/playground)
error[E0308]: mismatched types
@ -349,8 +355,9 @@ method. OK, but how can we call it on the `Rc<T>`? Easily! `Rc<T>` implements
`T` objects. If we read on _`Deref` coercion_, we can see the following:
> If `T` implements `Deref<Target = U>`, …:
> * …
> * `T` implicitly implements all the (immutable) methods of the type `U`.
>
> - …
> - `T` implicitly implements all the (immutable) methods of the type `U`.
What is the requirement for the `.borrow_mut()` on `RefCell<T>`? Well, it needs
`&self`, so the `Deref` implements the `.borrow_mut()` for the `Rc<RefCell<T>>`.
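A minimal illustration of that coercion at work (my own snippet, with no `BorrowMut` import in scope):

```rust
use std::cell::RefCell;
use std::rc::Rc;

fn main() {
    let shared = Rc::new(RefCell::new(0));

    // `Rc<T>` derefs to `T`, so `RefCell::borrow_mut` is reachable directly
    // through the `Rc` without dereferencing it explicitly.
    *shared.borrow_mut() += 1;

    assert_eq!(*shared.borrow(), 1);
}
```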
@ -360,6 +367,7 @@ What is the requirement for the `.borrow_mut()` on `RefCell<T>`? Well, it needs
I have not been able to find a lot on this trait. My guess is that it provides a
method instead of a syntactic sugar (`&mut x`) for the mutable borrow. And also
it provides default implementations for the types:
```rust
impl BorrowMut<str> for String
@ -390,6 +398,7 @@ Now the question is why did it break the code… My first take was that the type
the `use` overrides it with the default, which is true **in a sense**. However
there is no _specialized_ implementation. Let's have a look at the trait and the
type signature on the `RefCell<T>`:
```rust
// trait
pub trait BorrowMut<Borrowed>: Borrow<Borrowed>
@ -474,6 +483,6 @@ left and right :smile:
[^2]: Pardon my example from the graph algorithms ;)
[^3]: [`Neg`](https://doc.rust-lang.org/std/ops/trait.Neg.html) docs
[_Advent of Code_]: https://adventofcode.com
[_advent of code_]: https://adventofcode.com
[`itertools`]: https://crates.io/crates/itertools
[this Reddit post and the comment]: https://www.reddit.com/r/adventofcode/comments/zb98pn/comment/iyq0ono
[this reddit post and the comment]: https://www.reddit.com/r/adventofcode/comments/zb98pn/comment/iyq0ono


@ -5,9 +5,9 @@ date: 2023-07-07T15:14
slug: aoc-2022/4th-week
authors: mf
tags:
- advent-of-code
- advent-of-code-2022
- rust
- advent-of-code
- advent-of-code-2022
- rust
hide_table_of_contents: false
---
@ -46,6 +46,7 @@ rows (cause each row is a `Vec` element), but not for the columns, since they
span multiple rows.
For this use case I have implemented my own _column iterator_:
```rust
pub struct ColumnIterator<'a, T> {
map: &'a [Vec<T>],
@ -76,6 +77,7 @@ impl<'a, T> Iterator for ColumnIterator<'a, T> {
Given this piece of an iterator, it is very easy to factor out the common
functionality between the rows and columns into:
```rust
let mut find_boundaries = |constructor: fn(usize) -> Orientation,
iterator: &mut dyn Iterator<Item = &char>,
@ -92,6 +94,7 @@ let mut find_boundaries = |constructor: fn(usize) -> Orientation,
```
And then use it as such:
```rust
// construct all horizontal boundaries
(0..map.len()).for_each(|row| {
@ -119,6 +122,7 @@ And then use it as such:
Once the 2nd part got introduced, you start to think about a way not to
copy-paste a lot of stuff (I haven't avoided it anyways…). In this problem, I've
chosen to introduce a trait (i.e. _interface_) for the 2D and 3D walkers.
```rust
trait Wrap: Clone {
type State;
@ -139,14 +143,16 @@ trait Wrap: Clone {
Each walker maintains its own state and also provides the functions that are
used during the simulation. The “promised” methods are separated into:
* _simulation_-related: that are used during the simulation from the `.fold()`
* _movement_-related: just a one method that holds most of the logic differences
- _simulation_-related: that are used during the simulation from the `.fold()`
- _movement_-related: just a one method that holds most of the logic differences
between 2D and 3D
* _final answer_: which extracts the _proof of solution_ from the
- _final answer_: which extracts the _proof of solution_ from the
implementation-specific walker
Both 2D and 3D versions borrow the original input and therefore you must
annotate the lifetime of it:
```rust
struct Wrap2D<'a> {
input: &'a Input,
@ -179,6 +185,7 @@ When writing the parsing for this problem, the first thing I have spotted on the
`char` was the `.is_digit()` function that takes a radix as a parameter. Clippy
noticed that I use `radix = 10` and suggested switching to `.is_ascii_digit()`
that does exactly the same thing:
```diff
- .take_while(|c| c.is_digit(10))
+ .take_while(|c| c.is_ascii_digit())
@ -189,6 +196,7 @@ to get the $n$-th element from it. You know the `.skip()`, you know the
`.next()`, just “slap” them together and we're done for :grin: Well, I got
suggested to use `.nth()` that does exactly the combination of the two mentioned
methods on iterators:
```diff
- match it.clone().skip(skip).next().unwrap() {
+ match it.clone().nth(skip).unwrap() {
@ -214,6 +222,7 @@ minimum that are, of course, exactly the same except for initial values and
comparators, it looks like a rather simple fix, but typing in Rust is something
else, right? In the end I settled for a function that computes both boundaries
without any duplication while using a closure:
```rust
fn get_bounds(positions: &Input) -> (Vector2D<isize>, Vector2D<isize>) {
let f = |init, cmp: &dyn Fn(isize, isize) -> isize| {
@ -234,9 +243,11 @@ bounding rectangle of all elves.
You might ask why we would need a closure, and the answer is that `positions`
cannot be captured from within the nested function, only via closure. One more
fun fact on top of that is the type of the comparator
```rust
&dyn Fn(isize, isize) -> isize
```
Once we remove the `dyn` keyword, the compiler yells at us and also includes a
way to get a more thorough explanation of the error by running
@ -245,30 +256,30 @@ how to get a more thorough explanation of the error by running
which shows us
Trait objects must include the `dyn` keyword.
Erroneous code example:
```
trait Foo {}
fn test(arg: Box<Foo>) {} // error!
```
Trait objects are a way to call methods on types that are not known until
runtime but conform to some trait.
Trait objects should be formed with `Box<dyn Foo>`, but in the code above
`dyn` is left off.
This makes it harder to see that `arg` is a trait object and not
simply a heap-allocated type called `Foo`.
To fix this issue, add `dyn` before the trait name.
```
trait Foo {}
fn test(arg: Box<dyn Foo>) {} // ok!
```
This used to be allowed before edition 2021, but is now an error.
:::danger Rant
@ -277,8 +288,9 @@ Not all of the explanations are helpful though, in some cases they might be even
more confusing than helpful, since they address _very simple_ use cases.
As you can see, even in this case there are two sides to the explanations:
* it explains why you need to use `dyn`, but
* it still mentions that trait objects need to be heap-allocated via `Box<T>`
- it explains why you need to use `dyn`, but
- it still mentions that trait objects need to be heap-allocated via `Box<T>`
that, as you can see in my snippet, **does not** apply here :smile: IMO it's
caused by the fact that we are borrowing it and therefore we don't need to
care about the size or whereabouts of it.
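A tiny sketch of that point (mine, not from the post): a borrowed trait object needs no heap allocation at all.

```rust
// The comparator is a borrowed trait object; no `Box` anywhere in sight.
fn apply(f: &dyn Fn(isize, isize) -> isize, a: isize, b: isize) -> isize {
    f(a, b)
}

fn main() {
    let max = |a: isize, b: isize| a.max(b);
    assert_eq!(apply(&max, 2, 5), 5);
}
```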
@ -352,6 +364,7 @@ more verbose.
I'll skip the boring parts of checking bounds and entry/exit of the basin :wink:
We can easily calculate the positions of the blizzards using modular arithmetic:
```rust
impl Index<Position> for Basin {
type Output = char;
@ -428,12 +441,14 @@ that promotes vertices closer to the exit **and** with a minimum time taken.
:::
The cost function is, of course, a closure :wink:
```rust
let cost = |p: Position| p.z() as usize + exit.y().abs_diff(p.y()) + exit.x().abs_diff(p.x());
```
And also for checking the possible moves from the current vertex, I have
implemented yet another closure that yields an iterator with the next moves:
```rust
let next_positions = |p| {
[(0, 0, 1), (0, -1, 1), (0, 1, 1), (-1, 0, 1), (1, 0, 1)]
@ -461,6 +476,7 @@ popping the most prioritized elements yields values wrapped in the `Reverse`.
For this purpose I have just taken the max-heap and wrapped it as a whole in a
separate structure providing just the desired methods:
```rust
use std::cmp::{Ord, Reverse};
use std::collections::BinaryHeap;
@ -509,14 +525,16 @@ with a rather easy solution, as the last day always seems to be.
Implementing 2 functions, converting from the _SNAFU base_ and back to the _SNAFU_
_base_ representation. Let's do a bit more though! I have implemented two functions:
* `from_snafu`
* `to_snafu`
- `from_snafu`
- `to_snafu`
Now it is apparent that all I do is number to string and string to number. Hmm…
that sounds familiar, doesn't it? Let's introduce a structure for the SNAFU numbers
and implement the traits that we need.
Let's start with a structure:
```rust
#[derive(Debug, PartialEq, Eq, PartialOrd, Ord)]
struct SNAFU {
@ -550,6 +568,7 @@ After those changes we need to adjust the code and tests.
Parsing of the input is very easy, before we have used the lines, now we parse
everything:
```diff
fn parse_input<P: AsRef<Path>>(pathname: P) -> Input {
- file_to_lines(pathname)
@ -558,6 +577,7 @@ everything:
```
Part 1 needs to be adjusted a bit too:
```diff
fn part_1(input: &Input) -> Output {
- to_snafu(input.iter().map(|s| from_snafu(s)).sum())
@ -569,6 +589,7 @@ You can also see that it simplifies the meaning a bit and it is more explicit th
the previous versions.
And for the tests:
```diff
#[test]
fn test_from() {
@ -578,7 +599,7 @@ And for the tests:
+ assert_eq!(s.parse::<SNAFU>().unwrap().value, n);
}
}
#[test]
fn test_to() {
- for (n, s) in EXAMPLES.iter() {
@ -597,7 +618,7 @@ Let's wrap the whole thing up! Keeping in mind both AoC and the Rust…
### Advent of Code
This year was quite fun, even though most of the solutions and posts came in
later on (*cough* in '23 *cough*). Day 22 was the most obnoxious one… And also
later on (_cough_ in '23 _cough_). Day 22 was the most obnoxious one… And also
it feels like I used priority queues and tree data structures **a lot** :eyes:
### with Rust
@ -634,8 +655,8 @@ tl;dr Relatively pleasant language until you hit brick wall :wink:
See you next year! Maybe in Rust, maybe not :upside_down_face:
[_Advent of Code_]: https://adventofcode.com
[_A\*_]: https://en.wikipedia.org/wiki/A*_search_algorithm
[`BinaryHeap`]: https://doc.rust-lang.org/std/collections/struct.BinaryHeap.html
[`Reverse`]: https://doc.rust-lang.org/std/cmp/struct.Reverse.html
[docs of the `BinaryHeap`]: https://doc.rust-lang.org/std/collections/struct.BinaryHeap.html#min-heap
[_advent of code_]: https://adventofcode.com
[_a\*_]: https://en.wikipedia.org/wiki/A*_search_algorithm
[`binaryheap`]: https://doc.rust-lang.org/std/collections/struct.BinaryHeap.html
[`reverse`]: https://doc.rust-lang.org/std/cmp/struct.Reverse.html
[docs of the `binaryheap`]: https://doc.rust-lang.org/std/collections/struct.BinaryHeap.html#min-heap


@ -5,9 +5,9 @@ date: 2023-03-04T23:15
slug: leetcode/sort-diagonally
authors: mf
tags:
- cpp
- leetcode
- iterators
- cpp
- leetcode
- iterators
hide_table_of_contents: false
---
@ -16,7 +16,7 @@ same time.
<!--truncate-->
* Link to the problem: https://leetcode.com/problems/sort-the-matrix-diagonally/
- Link to the problem: https://leetcode.com/problems/sort-the-matrix-diagonally/
## Problem description
@ -41,7 +41,7 @@ We are given the following skeleton for the C++ and the given challenge:
class Solution {
public:
vector<vector<int>> diagonalSort(vector<vector<int>>& mat) {
}
};
```
@ -131,6 +131,7 @@ advantage, given that you know how to “bend” the data structures accordingly
What does that mean for us? Well, we have an `std::sort`, we can use it, right?
Let's have a look at it:
```cpp
template< class RandomIt >
void sort( RandomIt first, RandomIt last );
@ -162,6 +163,7 @@ up, i.e. “_compiler-assisted development_”[^2] if you will ;)
Now we know that we can use `std::sort` to sort the diagonal itself, but we also
need to get the diagonals somehow. I'm rather lazy, so I'll just delegate it to
someone else[^3]. And that way we get
```cpp
matrix diagonalSort(matrix mat)
{
@ -179,6 +181,7 @@ matrix diagonalSort(matrix mat)
This solution looks very simple, doesn't it? Well, cause it is.
Let's try compiling it:
```
matrix-sort.cpp:11:23: error: use of undeclared identifier 'diagonals' [clang-diagnostic-error]
for (auto d : diagonals(mat)) {
@ -199,9 +202,10 @@ in our matrix. We use the _for-range_ loop, so whatever we get back from the
do such functionality for a matrix of any type, not just the `int` from the challenge.
As I said, we need to be able to
* construct the object
* get the beginning
* get the end (the “sentinel”)
- construct the object
- get the beginning
- get the end (the “sentinel”)
```cpp
template <typename T>
@ -295,10 +299,11 @@ public:
In this case we will be implementing a “simple” forward iterator, so we don't
need to implement a lot. Notably it will be:
* inequality operator (we need to know when we reach the end and have nothing to
- inequality operator (we need to know when we reach the end and have nothing to
iterate over)
* preincrementation operator (we need to be able to move around the iterable)
* dereference operator (we need to be able to retrieve the objects we iterate
- preincrementation operator (we need to be able to move around the iterable)
- dereference operator (we need to be able to retrieve the objects we iterate
over)
```cpp
@ -376,6 +381,7 @@ After implementing the iterator over diagonals, we know that all we need to desc
a diagonal is the matrix itself and the “start” of the diagonal (row and column).
And we also know that the diagonal must provide some iterators for the `std::sort`
function. We can start with the following skeleton:
```cpp
template <typename T>
class diagonal {
@ -434,12 +440,13 @@ steps.
Let's go through all of the functionality that our iterator needs to support to
be used in `std::sort`. We need the usual operations like:
* equality/inequality
* incrementation
* dereferencing
- equality/inequality
- incrementation
- dereferencing
We will also add all the types that our iterator uses with the category of the
iterator, i.e. what interface it supports:
```cpp
class diagonal_iter {
// we need to keep reference to the matrix itself
@ -486,13 +493,14 @@ public:
This is pretty similar to the previous iterator, but now we need to implement the
remaining requirements of the _random access iterator_. Let's see what those are:
* decrementation - cause we need to be able to move backwards too, since _random _
- decrementation - cause we need to be able to move backwards too, since _random _
_access iterator_ extends the interface of _bidirectional iterator_
* moving the iterator in either direction by steps given as an integer
* being able to tell the distance between two iterators
* define an ordering on the iterators
- moving the iterator in either direction by steps given as an integer
- being able to tell the distance between two iterators
- define an ordering on the iterators
Let's fill them in:
```cpp
class diagonal_iter {
// we need to keep reference to the matrix itself
@ -575,6 +583,7 @@ public:
At this point we could probably try and compile it, right? If we do so, we will
get yelled at by a compiler for the following reasons:
```
/usr/bin/../lib/gcc/x86_64-redhat-linux/12/../../../../include/c++/12/bits/stl_algo.h:1792:11: error: object of type 'diagonal<int>::diagonal_iter' cannot be assigned because its copy assignment operator is implicitly deleted [clang-diagnostic-error]
__last = __next;
@ -633,6 +642,7 @@ matrix-sort.cpp:17:19: note: copy assignment operator of 'diagonal_iter' is impl
```
That's a lot of noise, isn't it? Let's focus on the important parts:
```
/usr/bin/../lib/gcc/x86_64-redhat-linux/12/../../../../include/c++/12/bits/stl_algo.h:1792:11: error: object of type 'diagonal<int>::diagonal_iter' cannot be assigned because its copy assignment operator is implicitly deleted [clang-diagnostic-error]
@ -644,6 +654,7 @@ matrix-sort.cpp:17:19: note: copy assignment operator of 'diagonal_iter' is impl
Ah! We have a reference in our iterator, and this prevents us from having a copy
assignment operator (that is used “somewhere” in the sorting algorithm). Well…
Let's just wrap it!
```diff
# we need to keep a different type than reference
- matrix_t& m;
@ -667,4 +678,4 @@ We're done now! We have written an iterator over diagonals for a 2D `vector`. Yo
[^1]: just because I'm used to it and don't care about your opinion ;)
[^2]: exercise at your own risk
[^3]: me in 5 minutes in fact, but don't make me scared
[^4]: me in the next section…
[^4]: me in the next section…


@ -131,7 +131,7 @@ const config = {
for (let mapping of fallbackMapping) {
if (existingPath.includes(`/${mapping.new}/`)) {
return mapping.old.map((old) =>
existingPath.replace(`/${mapping.new}/`, `/${old}/`)
existingPath.replace(`/${mapping.new}/`, `/${old}/`),
);
}
}


@ -23,4 +23,4 @@ for relative_path in $(find ./static/files -name '.archive' -print); do
tar caf $base.tar.bz2 $all_files
cd $ROOT_DIR
done;
done;


@ -1,22 +1,22 @@
import React from 'react';
import React from "react";
import clsx from 'clsx';
import clsx from "clsx";
import styles from './styles.module.css';
import styles from "./styles.module.css";
export default function ThemedSVG(props): JSX.Element {
const { source, className: parentClassName, alt, ...propsRest } = props;
const { source, className: parentClassName, alt, ...propsRest } = props;
return (
<>
<img
className={clsx("light-mode-only", parentClassName, styles.themed_svg)}
src={`${source}_light.svg`}
/>
<img
className={clsx("dark-mode-only", parentClassName, styles.themed_svg)}
src={`${source}_dark.svg`}
/>
</>
);
return (
<>
<img
className={clsx("light-mode-only", parentClassName, styles.themed_svg)}
src={`${source}_light.svg`}
/>
<img
className={clsx("dark-mode-only", parentClassName, styles.themed_svg)}
src={`${source}_dark.svg`}
/>
</>
);
}

View file

@ -1,3 +1,3 @@
.themed_svg {
max-width: inherit;
max-width: inherit;
}

View file

@ -49,4 +49,4 @@
[data-theme="dark"] & {
fill: var(--ifm-font-color-base);
}
}
}

View file

@ -54,4 +54,4 @@ const Contribution: FunctionComponent<ContributionMetadata> = ({
);
};
export default Contribution;
export default Contribution;

View file

@ -1,2 +1,2 @@
<!-- Source: https://remixicon.com/ -->
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24"><path fill="none" d="M0 0h24v24H0z"/><path d="M13 21v2.5l-3-2-3 2V21h-.5A3.5 3.5 0 0 1 3 17.5V5a3 3 0 0 1 3-3h14a1 1 0 0 1 1 1v17a1 1 0 0 1-1 1h-7zm0-2h6v-3H6.5a1.5 1.5 0 0 0 0 3H7v-2h6v2zm6-5V4H6v10.035A3.53 3.53 0 0 1 6.5 14H19zM7 5h2v2H7V5zm0 3h2v2H7V8zm0 3h2v2H7v-2z"/></svg>
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24"><path fill="none" d="M0 0h24v24H0z"/><path d="M13 21v2.5l-3-2-3 2V21h-.5A3.5 3.5 0 0 1 3 17.5V5a3 3 0 0 1 3-3h14a1 1 0 0 1 1 1v17a1 1 0 0 1-1 1h-7zm0-2h6v-3H6.5a1.5 1.5 0 0 0 0 3H7v-2h6v2zm6-5V4H6v10.035A3.53 3.53 0 0 1 6.5 14H19zM7 5h2v2H7V5zm0 3h2v2H7V8zm0 3h2v2H7v-2z"/></svg>

Before

Width:  |  Height:  |  Size: 379 B

After

Width:  |  Height:  |  Size: 380 B

View file

@ -49,4 +49,4 @@
[data-theme="dark"] & {
fill: var(--ifm-font-color-base);
}
}
}

View file

@ -135,4 +135,4 @@ function formatDateString(date: Date): string {
return `${date.getMonth() + 1}/${date.getUTCFullYear()}`;
}
export default Talk;
export default Talk;

View file

@ -1,2 +1,2 @@
<!-- Source: https://remixicon.com/ -->
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24"><path fill="none" d="M0 0h24v24H0z"/><path d="M17 3h4a1 1 0 0 1 1 1v16a1 1 0 0 1-1 1H3a1 1 0 0 1-1-1V4a1 1 0 0 1 1-1h4V1h2v2h6V1h2v2zm-2 2H9v2H7V5H4v4h16V5h-3v2h-2V5zm5 6H4v8h16v-8z"/></svg>
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24"><path fill="none" d="M0 0h24v24H0z"/><path d="M17 3h4a1 1 0 0 1 1 1v16a1 1 0 0 1-1 1H3a1 1 0 0 1-1-1V4a1 1 0 0 1 1-1h4V1h2v2h6V1h2v2zm-2 2H9v2H7V5H4v4h16V5h-3v2h-2V5zm5 6H4v8h16v-8z"/></svg>

Before

Width:  |  Height:  |  Size: 290 B

After

Width:  |  Height:  |  Size: 291 B

View file

@ -1,2 +1,2 @@
<!-- Source: https://remixicon.com/ -->
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24"><path fill="none" d="M0 0h24v24H0z"/><path d="M12 23.728l-6.364-6.364a9 9 0 1 1 12.728 0L12 23.728zm4.95-7.778a7 7 0 1 0-9.9 0L12 20.9l4.95-4.95zM12 13a2 2 0 1 1 0-4 2 2 0 0 1 0 4z"/></svg>
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24"><path fill="none" d="M0 0h24v24H0z"/><path d="M12 23.728l-6.364-6.364a9 9 0 1 1 12.728 0L12 23.728zm4.95-7.778a7 7 0 1 0-9.9 0L12 20.9l4.95-4.95zM12 13a2 2 0 1 1 0-4 2 2 0 0 1 0 4z"/></svg>

Before

Width:  |  Height:  |  Size: 289 B

After

Width:  |  Height:  |  Size: 290 B

View file

@ -1,2 +1,2 @@
<!-- Source: https://remixicon.com/ -->
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24"><path fill="none" d="M0 0h24v24H0z"/><path d="M6.455 19L2 22.5V4a1 1 0 0 1 1-1h18a1 1 0 0 1 1 1v14a1 1 0 0 1-1 1H6.455zm-.692-2H20V5H4v13.385L5.763 17zM11 10h2v2h-2v-2zm-4 0h2v2H7v-2zm8 0h2v2h-2v-2z"/></svg>
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24"><path fill="none" d="M0 0h24v24H0z"/><path d="M6.455 19L2 22.5V4a1 1 0 0 1 1-1h18a1 1 0 0 1 1 1v14a1 1 0 0 1-1 1H6.455zm-.692-2H20V5H4v13.385L5.763 17zM11 10h2v2h-2v-2zm-4 0h2v2H7v-2zm8 0h2v2h-2v-2z"/></svg>

Before

Width:  |  Height:  |  Size: 307 B

After

Width:  |  Height:  |  Size: 308 B

View file

@ -1,2 +1,2 @@
<!-- Source: https://remixicon.com/ -->
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24"><path fill="none" d="M0 0h24v24H0z"/><path d="M2 3.993A1 1 0 0 1 2.992 3h18.016c.548 0 .992.445.992.993v16.014a1 1 0 0 1-.992.993H2.992A.993.993 0 0 1 2 20.007V3.993zM8 5v14h8V5H8zM4 5v2h2V5H4zm14 0v2h2V5h-2zM4 9v2h2V9H4zm14 0v2h2V9h-2zM4 13v2h2v-2H4zm14 0v2h2v-2h-2zM4 17v2h2v-2H4zm14 0v2h2v-2h-2z"/></svg>
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24"><path fill="none" d="M0 0h24v24H0z"/><path d="M2 3.993A1 1 0 0 1 2.992 3h18.016c.548 0 .992.445.992.993v16.014a1 1 0 0 1-.992.993H2.992A.993.993 0 0 1 2 20.007V3.993zM8 5v14h8V5H8zM4 5v2h2V5H4zm14 0v2h2V5h-2zM4 9v2h2V9H4zm14 0v2h2V9h-2zM4 13v2h2v-2H4zm14 0v2h2v-2h-2zM4 17v2h2v-2H4zm14 0v2h2v-2h-2z"/></svg>

Before

Width:  |  Height:  |  Size: 407 B

After

Width:  |  Height:  |  Size: 408 B

View file

@ -1,2 +1,2 @@
<!-- Source: https://remixicon.com/ -->
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24"><path fill="none" d="M0 0h24v24H0z"/><path d="M13 21v2.5l-3-2-3 2V21h-.5A3.5 3.5 0 0 1 3 17.5V5a3 3 0 0 1 3-3h14a1 1 0 0 1 1 1v17a1 1 0 0 1-1 1h-7zm0-2h6v-3H6.5a1.5 1.5 0 0 0 0 3H7v-2h6v2zm6-5V4H6v10.035A3.53 3.53 0 0 1 6.5 14H19zM7 5h2v2H7V5zm0 3h2v2H7V8zm0 3h2v2H7v-2z"/></svg>
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24"><path fill="none" d="M0 0h24v24H0z"/><path d="M13 21v2.5l-3-2-3 2V21h-.5A3.5 3.5 0 0 1 3 17.5V5a3 3 0 0 1 3-3h14a1 1 0 0 1 1 1v17a1 1 0 0 1-1 1h-7zm0-2h6v-3H6.5a1.5 1.5 0 0 0 0 3H7v-2h6v2zm6-5V4H6v10.035A3.53 3.53 0 0 1 6.5 14H19zM7 5h2v2H7V5zm0 3h2v2H7V8zm0 3h2v2H7v-2z"/></svg>

Before

Width:  |  Height:  |  Size: 379 B

After

Width:  |  Height:  |  Size: 380 B

View file

@ -1,2 +1,2 @@
<!-- Source: https://remixicon.com/ -->
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24"><path fill="none" d="M0 0h24v24H0z"/><path d="M13 18v2h4v2H7v-2h4v-2H3a1 1 0 0 1-1-1V4a1 1 0 0 1 1-1h18a1 1 0 0 1 1 1v13a1 1 0 0 1-1 1h-8zM4 5v11h16V5H4zm6 2.5l5 3-5 3v-6z"/></svg>
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24"><path fill="none" d="M0 0h24v24H0z"/><path d="M13 18v2h4v2H7v-2h4v-2H3a1 1 0 0 1-1-1V4a1 1 0 0 1 1-1h18a1 1 0 0 1 1 1v13a1 1 0 0 1-1 1h-8zM4 5v11h16V5H4zm6 2.5l5 3-5 3v-6z"/></svg>

Before

Width:  |  Height:  |  Size: 280 B

After

Width:  |  Height:  |  Size: 281 B

View file

@ -1,19 +1,19 @@
@font-face {
font-family: "Cascadia Code PL";
src: url("/fonts/CascadiaCodePL.woff2") format("woff2");
font-family: "Cascadia Code PL";
src: url("/fonts/CascadiaCodePL.woff2") format("woff2");
}
@font-face {
font-family: "Cascadia Code PL";
font-style: italic;
src: url("/fonts/CascadiaCodePLItalic.woff2") format("woff2");
font-family: "Cascadia Code PL";
font-style: italic;
src: url("/fonts/CascadiaCodePLItalic.woff2") format("woff2");
}
[data-theme='dark'] pre,
[data-theme='dark'] code,
[data-theme='dark'] kbd,
[data-theme='dark'] var,
[data-theme='dark'] tt,
[data-theme='dark'] text {
font-weight: 350;
[data-theme="dark"] pre,
[data-theme="dark"] code,
[data-theme="dark"] kbd,
[data-theme="dark"] var,
[data-theme="dark"] tt,
[data-theme="dark"] text {
font-weight: 350;
}

View file

@ -16,7 +16,7 @@
}
/* For readability concerns, you should choose a lighter palette in dark mode. */
[data-theme='dark'] {
[data-theme="dark"] {
--ifm-color-primary: #1d9bf0;
--ifm-color-primary-dark: #0f8ee3;
--ifm-color-primary-darker: #0e86d6;
@ -32,11 +32,12 @@ kbd,
var,
tt,
text {
font-family: 'Cascadia Code PL', 'JetBrains Mono', 'Iosevka', 'Fira Code', 'Hack', monospace;
font-family: "Cascadia Code PL", "JetBrains Mono", "Iosevka", "Fira Code",
"Hack", monospace;
}
[data-theme='light'] img[src$='#gh-dark-mode-only'],
[data-theme='dark'] img[src$='#gh-light-mode-only'],
[data-theme="light"] img[src$="#gh-dark-mode-only"],
[data-theme="dark"] img[src$="#gh-light-mode-only"],
[data-theme="light"] .dark-mode-only,
[data-theme="dark"] .light-mode-only {
display: none;

View file

@ -1,20 +1,19 @@
@font-face {
font-family: "JetBrains Mono";
src: url("/fonts/JetBrainsMono[wght].woff2") format("woff2");
font-family: "JetBrains Mono";
src: url("/fonts/JetBrainsMono[wght].woff2") format("woff2");
}
@font-face {
font-family: "JetBrains Mono";
font-style: italic;
src: url("/fonts/JetBrainsMono-Italic[wght].woff2") format("woff2");
font-family: "JetBrains Mono";
font-style: italic;
src: url("/fonts/JetBrainsMono-Italic[wght].woff2") format("woff2");
}
[data-theme='dark'] pre,
[data-theme='dark'] code,
[data-theme='dark'] kbd,
[data-theme='dark'] var,
[data-theme='dark'] tt,
[data-theme='dark'] text {
font-weight: 350;
[data-theme="dark"] pre,
[data-theme="dark"] code,
[data-theme="dark"] kbd,
[data-theme="dark"] var,
[data-theme="dark"] tt,
[data-theme="dark"] text {
font-weight: 350;
}

View file

@ -4,4 +4,4 @@ $breakpoint: 996px;
@media (max-width: $breakpoint) {
@content;
}
}
}

View file

@ -1,221 +1,270 @@
import React from "react";
import Layout from "@theme/Layout";
import Contribution, { ContributionMetadata } from "../components/contributions/Contribution";
import Contribution, {
ContributionMetadata,
} from "../components/contributions/Contribution";
const contributions: ContributionMetadata[] = [
{
title: "tmt",
description: (<p>
{
title: "tmt",
description: (
<p>
The `tmt` tool provides a user-friendly way to work with tests. You can
comfortably create new tests, safely and easily run tests across different
environments, review test results, debug test code and enable tests in the
CI using a consistent and concise config.
</p>),
contribution: (<p>
Just a smallish contribution to the docs related to the changes implemented
on the Packit side.
</p>),
repoURL: "https://github.com/teemtee/tmt",
},
{
title: "Fedora Infrastructure Ansible",
description: (<p>
Collection of Ansible playbooks that powers the Fedora Infrastructure.
</p>),
contribution: (<p>
I have adjusted the groups in the Bodhi playbooks after Packit has
been granted the privileges to propose updates without restrictions.
</p>),
repoURL: "https://pagure.io/fedora-infra/ansible",
},
{
title: "Bodhi",
description: (<p>
Bodhi is a web-system that facilitates the process of publishing
updates for a Fedora-based software distribution.
</p>),
contribution: (<p>
I have adjusted the client, so that it doesn't show secrets in
terminal when you log in to the Bodhi via browser.
</p>),
repoURL: "https://github.com/fedora-infra/bodhi",
},
{
title: "Gluetool Modules Collection",
description: (<p>
Modules for <code>gluetool</code> a command line centric framework
usable for glueing modules into a pipeline.
</p>),
contribution: (<ul>
<li>
I have proposed a possible implementation of git merging that
was later on extended.
</li>
<li>
I have tried to help out with Copr module after they deprecated
older version of their API.
</li>
</ul>),
repoURL: "https://gitlab.com/testing-farm/gluetool-modules",
},
{
title: "Pagure",
description: (<p>
Pagure is a git-centered forge, python based using pygit2.
</p>),
contribution: (<p>
I have added an API endpoint for reopening pull requests.
</p>),
repoURL: "https://pagure.io/pagure",
},
{
title: "Copr",
description: (<p>
RPM build system - upstream for{" "}
<a target="_blank" href="https://copr.fedorainfracloud.org/">Copr</a>.
</p>),
contribution: (<ul>
<li>
Supporting external repositories for custom SRPM build method.
</li>
<li>
Allowing admins of Copr repositories to build without the need
to ask for explicit <code>builder</code> permissions.
</li>
</ul>),
repoURL: "https://github.com/fedora-copr/copr",
},
{
title: "python-gitlab",
description: (<p>
A python wrapper for the GitLab API.
</p>),
contribution: (<p>
I have contributed support for the <code>merge_ref</code> on merge
requests that hasn't been supported, yet it was present in the GitLab
API.
</p>),
repoURL: "https://github.com/python-gitlab/python-gitlab",
},
{
title: "PatternFly React",
description: (<p>
A set of React components for the PatternFly project.
</p>),
contribution: (<p>
When working on Packit Dashboard, I have spotted smaller bugs that
were present in this project and fixed them upstream to provide
better experience for our users.
</p>),
repoURL: "https://github.com/patternfly/patternfly-react",
},
{
title: "Fira Code",
description: (<p>
Free monospaced font with programming ligatures.
</p>),
contribution: (<p>
I have set up a GitHub Action for building the font on each push to
the default branch allowing users to install <i>bleeding edge</i>{" "}
version of the font.
</p>),
repoURL: "https://github.com/tonsky/FiraCode",
},
{
title: "nixpkgs",
description: (<p>
Nixpkgs is a collection of over 80,000 software packages that can be
installed with the Nix package manager. It also implements NixOS,
a purely-functional Linux distribution.
</p>),
contribution: (<p>
When I was trying out the nixpkgs, I have tried to bump .NET Core to
the latest version. My changes haven't been accepted as they required
bumping of multiple more packages that depended upon the .NET Core.
</p>),
repoURL: "https://github.com/NixOS/nixpkgs",
},
{
title: "Darcula",
description: (<p>
A theme for Visual Studio Code based on Darcula theme from Jetbrains
IDEs.
</p>),
contribution: (<p>
I have contributed support for diff files, though the project doesn't
seem to be live anymore, so it hasn't been accepted as of now.
</p>),
repoURL: "https://github.com/rokoroku/vscode-theme-darcula",
},
{
title: "Packit",
description: (<p>
An open source project aiming to ease the integration of your
project with Fedora Linux, CentOS Stream and other distributions.
</p>),
contribution: (<p>
Have a look at my{" "}
<a
href="https://github.com/search?q=is%3Apr%20author%3Amfocko%20org%3Apackit&type=pullrequests"
target="_blank">
pull requests
</a>.
</p>),
repoURL: "https://github.com/packit",
},
{
title: "Snitch",
description: (<>
<p>
Language agnostic tool that collects TODOs in the source code
and reports them as Issues.
</p>
</>),
contribution: (<ul>
<li>Environment variable support for self-hosted GitLab instances</li>
<li>GitLab support</li>
</ul>),
repoURL: "https://github.com/tsoding/snitch",
},
{
title: "Karel the Robot",
description: (<>
<p>
Karel the robot is in general an educational programming
language for beginners, created by <i>Richard E. Pattis</i>.
This is implementation of <i>Karel the Robot</i> for{" "}
<i>C programming language</i>.
</p>
<p>
This project is used for educational purposes at{" "}
<a target="_blank" href="https://fei.tuke.sk">TUKE</a>.
</p>
</>),
contribution: (<p>
I have contributed some refactoring tips to the author of the
library.
</p>),
repoURL: "https://git.kpi.fei.tuke.sk/kpi/karel-the-robot",
},
comfortably create new tests, safely and easily run tests across
different environments, review test results, debug test code and enable
tests in the CI using a consistent and concise config.
</p>
),
contribution: (
<p>
Just a smallish contribution to the docs related to the changes
implemented on the Packit side.
</p>
),
repoURL: "https://github.com/teemtee/tmt",
},
{
title: "Fedora Infrastructure Ansible",
description: (
<p>
Collection of Ansible playbooks that powers the Fedora Infrastructure.
</p>
),
contribution: (
<p>
I have adjusted the groups in the Bodhi playbooks after Packit has been
granted the privileges to propose updates without restrictions.
</p>
),
repoURL: "https://pagure.io/fedora-infra/ansible",
},
{
title: "Bodhi",
description: (
<p>
Bodhi is a web-system that facilitates the process of publishing updates
for a Fedora-based software distribution.
</p>
),
contribution: (
<p>
I have adjusted the client, so that it doesn't show secrets in terminal
when you log in to the Bodhi via browser.
</p>
),
repoURL: "https://github.com/fedora-infra/bodhi",
},
{
title: "Gluetool Modules Collection",
description: (
<p>
Modules for <code>gluetool</code> a command line centric framework
usable for glueing modules into a pipeline.
</p>
),
contribution: (
<ul>
<li>
I have proposed a possible implementation of git merging that was
later on extended.
</li>
<li>
I have tried to help out with Copr module after they deprecated older
version of their API.
</li>
</ul>
),
repoURL: "https://gitlab.com/testing-farm/gluetool-modules",
},
{
title: "Pagure",
description: (
<p>Pagure is a git-centered forge, python based using pygit2.</p>
),
contribution: (
<p>I have added an API endpoint for reopening pull requests.</p>
),
repoURL: "https://pagure.io/pagure",
},
{
title: "Copr",
description: (
<p>
RPM build system - upstream for{" "}
<a target="_blank" href="https://copr.fedorainfracloud.org/">
Copr
</a>
.
</p>
),
contribution: (
<ul>
<li>Supporting external repositories for custom SRPM build method.</li>
<li>
Allowing admins of Copr repositories to build without the need to ask
for explicit <code>builder</code> permissions.
</li>
</ul>
),
repoURL: "https://github.com/fedora-copr/copr",
},
{
title: "python-gitlab",
description: <p>A python wrapper for the GitLab API.</p>,
contribution: (
<p>
I have contributed support for the <code>merge_ref</code> on merge
requests that hasn't been supported, yet it was present in the GitLab
API.
</p>
),
repoURL: "https://github.com/python-gitlab/python-gitlab",
},
{
title: "PatternFly React",
description: <p>A set of React components for the PatternFly project.</p>,
contribution: (
<p>
When working on Packit Dashboard, I have spotted smaller bugs that were
present in this project and fixed them upstream to provide better
experience for our users.
</p>
),
repoURL: "https://github.com/patternfly/patternfly-react",
},
{
title: "Fira Code",
description: <p>Free monospaced font with programming ligatures.</p>,
contribution: (
<p>
I have set up a GitHub Action for building the font on each push to the
default branch allowing users to install <i>bleeding edge</i> version of
the font.
</p>
),
repoURL: "https://github.com/tonsky/FiraCode",
},
{
title: "nixpkgs",
description: (
<p>
Nixpkgs is a collection of over 80,000 software packages that can be
installed with the Nix package manager. It also implements NixOS, a
purely-functional Linux distribution.
</p>
),
contribution: (
<p>
When I was trying out the nixpkgs, I have tried to bump .NET Core to the
latest version. My changes haven't been accepted as they required
bumping of multiple more packages that depended upon the .NET Core.
</p>
),
repoURL: "https://github.com/NixOS/nixpkgs",
},
{
title: "Darcula",
description: (
<p>
A theme for Visual Studio Code based on Darcula theme from Jetbrains
IDEs.
</p>
),
contribution: (
<p>
I have contributed support for diff files, though the project doesn't
seem to be live anymore, so it hasn't been accepted as of now.
</p>
),
repoURL: "https://github.com/rokoroku/vscode-theme-darcula",
},
{
title: "Packit",
description: (
<p>
An open source project aiming to ease the integration of your project
with Fedora Linux, CentOS Stream and other distributions.
</p>
),
contribution: (
<p>
Have a look at my{" "}
<a
href="https://github.com/search?q=is%3Apr%20author%3Amfocko%20org%3Apackit&type=pullrequests"
target="_blank"
>
pull requests
</a>
.
</p>
),
repoURL: "https://github.com/packit",
},
{
title: "Snitch",
description: (
<>
<p>
Language agnostic tool that collects TODOs in the source code and
reports them as Issues.
</p>
</>
),
contribution: (
<ul>
<li>Environment variable support for self-hosted GitLab instances</li>
<li>GitLab support</li>
</ul>
),
repoURL: "https://github.com/tsoding/snitch",
},
{
title: "Karel the Robot",
description: (
<>
<p>
Karel the robot is in general an educational programming language for
beginners, created by <i>Richard E. Pattis</i>. This is implementation
of <i>Karel the Robot</i> for <i>C programming language</i>.
</p>
<p>
This project is used for educational purposes at{" "}
<a target="_blank" href="https://fei.tuke.sk">
TUKE
</a>
.
</p>
</>
),
contribution: (
<p>
I have contributed some refactoring tips to the author of the library.
</p>
),
repoURL: "https://git.kpi.fei.tuke.sk/kpi/karel-the-robot",
},
];
const title = "Contributions";
const description = "Many of my contributions to open-source projects.";
export default function Contributions(): JSX.Element {
return (
<Layout title={title} description={description}>
<main className="container container--fluid margin-vert--lg">
<h1>{title}</h1>
<p>{description}</p>
return (
<Layout title={title} description={description}>
<main className="container container--fluid margin-vert--lg">
<h1>{title}</h1>
<p>{description}</p>
<div className="row">
{contributions.map((contributionData) => (
<Contribution key={contributionData.project} {...contributionData} />
))}
</div>
</main>
</Layout>
);
<div className="row">
{contributions.map((contributionData) => (
<Contribution
key={contributionData.project}
{...contributionData}
/>
))}
</div>
</main>
</Layout>
);
}

View file

@ -3,93 +3,68 @@
// import type { PrismTheme } from '../types'
var theme/* : PrismTheme */ = {
"plain": {
"color": "#657b83",
"backgroundColor": "#fdf6e3"
var theme: PrismTheme = {
plain: {
color: "#657b83",
backgroundColor: "#fdf6e3",
},
"styles": [
styles: [
{
"types": [
"comment"
],
"style": {
"color": "rgb(147, 161, 161)",
"fontStyle": "italic"
}
types: ["comment"],
style: {
color: "rgb(147, 161, 161)",
fontStyle: "italic",
},
},
{
"types": [
"string"
],
"style": {
"color": "rgb(42, 161, 152)"
}
types: ["string"],
style: {
color: "rgb(42, 161, 152)",
},
},
{
"types": [
"number"
],
"style": {
"color": "rgb(211, 54, 130)"
}
types: ["number"],
style: {
color: "rgb(211, 54, 130)",
},
},
{
"types": [
"variable",
"function",
"tag"
],
"style": {
"color": "rgb(38, 139, 210)"
}
types: ["variable", "function", "tag"],
style: {
color: "rgb(38, 139, 210)",
},
},
{
"types": [
"class-name",
"keyword",
"char",
"constant",
"changed"
],
"style": {
"color": "rgb(203, 75, 22)"
}
types: ["class-name", "keyword", "char", "constant", "changed"],
style: {
color: "rgb(203, 75, 22)",
},
},
{
"types": [
"punctuation",
"inserted"
],
"style": {
"color": "rgb(133, 153, 0)"
}
types: ["punctuation", "inserted"],
style: {
color: "rgb(133, 153, 0)",
},
},
{
"types": [
"builtin"
],
"style": {
"color": "rgb(181, 137, 0)"
}
types: ["builtin"],
style: {
color: "rgb(181, 137, 0)",
},
},
{
"types": [
"attr-name"
],
"style": {
"color": "rgb(147, 161, 161)"
}
types: ["attr-name"],
style: {
color: "rgb(147, 161, 161)",
},
},
{
"types": [
"deleted"
],
"style": {
"color": "rgb(220, 50, 47)"
}
}
]
types: ["deleted"],
style: {
color: "rgb(220, 50, 47)",
},
},
],
};
module.exports = theme;

View file

@ -13,4 +13,4 @@ format:
tidy:
clang-tidy *.cpp -- $(CXXFLAGS)
.PHONY: matrix-sort format tidy
.PHONY: matrix-sort format tidy

View file

@ -226,4 +226,4 @@ int main()
test_case_2();
return 0;
}
}