diff --git a/404.html b/404.html index d56d25c..3c5e2e7 100644 --- a/404.html +++ b/404.html @@ -14,8 +14,8 @@ - - + +
Skip to main content

Page Not Found

We could not find what you were looking for.

Please contact the owner of the site that linked you to the original URL and let them know their link is broken.

diff --git a/algorithms/algorithms-correctness/postcondition-ambiguity/index.html b/algorithms/algorithms-correctness/postcondition-ambiguity/index.html index 9b787a4..9337596 100644 --- a/algorithms/algorithms-correctness/postcondition-ambiguity/index.html +++ b/algorithms/algorithms-correctness/postcondition-ambiguity/index.html @@ -16,8 +16,8 @@ - - + +
Skip to main content

Vague postconditions and proving correctness of algorithms

Introduction

diff --git a/algorithms/category/algorithms-and-correctness/index.html b/algorithms/category/algorithms-and-correctness/index.html index 0bc8c93..e543a7b 100644 --- a/algorithms/category/algorithms-and-correctness/index.html +++ b/algorithms/category/algorithms-and-correctness/index.html @@ -18,8 +18,8 @@ correctness. - - + +

Algorithms and Correctness

Materials related to basic ideas behind algorithms and proofs of their diff --git a/algorithms/category/asymptotic-notation-and-time-complexity/index.html b/algorithms/category/asymptotic-notation-and-time-complexity/index.html index be991b0..4efcdad 100644 --- a/algorithms/category/asymptotic-notation-and-time-complexity/index.html +++ b/algorithms/category/asymptotic-notation-and-time-complexity/index.html @@ -16,8 +16,8 @@ - - + +

Asymptotic Notation and Time Complexity

Materials related to asymptotic notation and time complexity. diff --git a/algorithms/category/graphs/index.html b/algorithms/category/graphs/index.html index b88436a..e0ea94f 100644 --- a/algorithms/category/graphs/index.html +++ b/algorithms/category/graphs/index.html @@ -16,8 +16,8 @@ - - + +

Graphs

Materials related to basic graph algorithms and graph problems. diff --git a/algorithms/category/hash-tables/index.html b/algorithms/category/hash-tables/index.html index 42888d7..afdf0ac 100644 --- a/algorithms/category/hash-tables/index.html +++ b/algorithms/category/hash-tables/index.html @@ -16,8 +16,8 @@ - - + +

Hash Tables

Materials related to hash tables. diff --git a/algorithms/category/paths-in-graphs/index.html b/algorithms/category/paths-in-graphs/index.html index 6bf3bd8..ee3e48d 100644 --- a/algorithms/category/paths-in-graphs/index.html +++ b/algorithms/category/paths-in-graphs/index.html @@ -16,8 +16,8 @@ - - + +

Paths in Graphs

Materials related to finding paths in graphs. diff --git a/algorithms/category/recursion/index.html b/algorithms/category/recursion/index.html index 6343812..c4ab50b 100644 --- a/algorithms/category/recursion/index.html +++ b/algorithms/category/recursion/index.html @@ -16,8 +16,8 @@ - - + +

Recursion

Materials related to recursive algorithms and their time complexity. diff --git a/algorithms/category/red-black-trees/index.html b/algorithms/category/red-black-trees/index.html index affad2e..178d71b 100644 --- a/algorithms/category/red-black-trees/index.html +++ b/algorithms/category/red-black-trees/index.html @@ -16,8 +16,8 @@ - - + +

Red-Black Trees

Materials related to red-black trees. diff --git a/algorithms/graphs/bfs-tree/index.html b/algorithms/graphs/bfs-tree/index.html index 805bbe1..799e504 100644 --- a/algorithms/graphs/bfs-tree/index.html +++ b/algorithms/graphs/bfs-tree/index.html @@ -16,8 +16,8 @@ - - + +

Distance boundaries from BFS tree on undirected graphs

Introduction

diff --git a/algorithms/graphs/iterative-and-iterators/index.html b/algorithms/graphs/iterative-and-iterators/index.html index 9a4c140..de575ed 100644 --- a/algorithms/graphs/iterative-and-iterators/index.html +++ b/algorithms/graphs/iterative-and-iterators/index.html @@ -16,8 +16,8 @@ - - + +

Iterative algorithms via iterators

Introduction

diff --git a/algorithms/hash-tables/breaking/index.html b/algorithms/hash-tables/breaking/index.html index e6b5bf3..3e8e50f 100644 --- a/algorithms/hash-tables/breaking/index.html +++ b/algorithms/hash-tables/breaking/index.html @@ -16,8 +16,8 @@ - - + +

Breaking hash table

We will try to break a hash table and discuss possible ways to prevent such diff --git a/algorithms/hash-tables/breaking/mitigations/index.html b/algorithms/hash-tables/breaking/mitigations/index.html index d926898..3a63f77 100644 --- a/algorithms/hash-tables/breaking/mitigations/index.html +++ b/algorithms/hash-tables/breaking/mitigations/index.html @@ -16,8 +16,8 @@ - - + +

Possible Mitigations

There are multiple ways the issues created above can be mitigated. Still we can diff --git a/algorithms/hash-tables/breaking/python/index.html b/algorithms/hash-tables/breaking/python/index.html index e6ab778..e91e891 100644 --- a/algorithms/hash-tables/breaking/python/index.html +++ b/algorithms/hash-tables/breaking/python/index.html @@ -16,8 +16,8 @@ - - + +

Breaking the Hash Table in Python

diff --git a/algorithms/index.html b/algorithms/index.html index b0b4ef5..0edfb2e 100644 --- a/algorithms/index.html +++ b/algorithms/index.html @@ -14,8 +14,8 @@ - - + +

Introduction

In this part you can find “random” additional materials I have written over the @@ -23,6 +23,6 @@ course of teaching Algorithms and data structures I.

It is a varied mix of stuff that may have been produced as a follow-up on some question asked at the seminar or spontaneously.

If you have some ideas for posts, please do not hesitate to submit them as issues -in the linked GitLab.

+in the linked GitLab.

\ No newline at end of file diff --git a/algorithms/paths/bf-to-astar/astar/index.html b/algorithms/paths/bf-to-astar/astar/index.html index a9336d2..19dae16 100644 --- a/algorithms/paths/bf-to-astar/astar/index.html +++ b/algorithms/paths/bf-to-astar/astar/index.html @@ -16,8 +16,8 @@ - - + +

A* algorithm

Intro

diff --git a/algorithms/paths/bf-to-astar/bf/index.html b/algorithms/paths/bf-to-astar/bf/index.html index 0326d72..ced7bce 100644 --- a/algorithms/paths/bf-to-astar/bf/index.html +++ b/algorithms/paths/bf-to-astar/bf/index.html @@ -18,8 +18,8 @@ something. - - + +

BF

Basic idea

diff --git a/algorithms/paths/bf-to-astar/dijkstra/index.html b/algorithms/paths/bf-to-astar/dijkstra/index.html index 1af0984..67082b0 100644 --- a/algorithms/paths/bf-to-astar/dijkstra/index.html +++ b/algorithms/paths/bf-to-astar/dijkstra/index.html @@ -16,8 +16,8 @@ - - + +

Dijkstra's algorithm

Intro

diff --git a/algorithms/paths/bf-to-astar/index.html b/algorithms/paths/bf-to-astar/index.html index 551120b..bce689b 100644 --- a/algorithms/paths/bf-to-astar/index.html +++ b/algorithms/paths/bf-to-astar/index.html @@ -16,8 +16,8 @@ - - + +

From BF to A*

Intro

diff --git a/algorithms/rb-trees/applications/index.html b/algorithms/rb-trees/applications/index.html index 3c05b50..f8cac0e 100644 --- a/algorithms/rb-trees/applications/index.html +++ b/algorithms/rb-trees/applications/index.html @@ -16,8 +16,8 @@ - - + +

Použití červeno-černých stromů

Použití

diff --git a/algorithms/rb-trees/rules/index.html b/algorithms/rb-trees/rules/index.html index 121db5d..c673760 100644 --- a/algorithms/rb-trees/rules/index.html +++ b/algorithms/rb-trees/rules/index.html @@ -16,8 +16,8 @@ - - + +

On the rules of the red-black tree

Introduction

diff --git a/algorithms/recursion/karel/index.html b/algorithms/recursion/karel/index.html index 2c7cb4d..3c5e554 100644 --- a/algorithms/recursion/karel/index.html +++ b/algorithms/recursion/karel/index.html @@ -16,8 +16,8 @@ - - + +

Recursion and backtracking with Robot Karel

    diff --git a/algorithms/recursion/karel/solution/index.html b/algorithms/recursion/karel/solution/index.html index 617da11..82e59d3 100644 --- a/algorithms/recursion/karel/solution/index.html +++ b/algorithms/recursion/karel/solution/index.html @@ -16,8 +16,8 @@ - - + +

    Solving the maze problem

    diff --git a/algorithms/recursion/pyramid-slide-down/bottom-up-dp/index.html b/algorithms/recursion/pyramid-slide-down/bottom-up-dp/index.html index 32003ab..eeb6293 100644 --- a/algorithms/recursion/pyramid-slide-down/bottom-up-dp/index.html +++ b/algorithms/recursion/pyramid-slide-down/bottom-up-dp/index.html @@ -16,8 +16,8 @@ - - + +

    Bottom-up dynamic programming

    diff --git a/algorithms/recursion/pyramid-slide-down/greedy/index.html b/algorithms/recursion/pyramid-slide-down/greedy/index.html index 81befa4..d786643 100644 --- a/algorithms/recursion/pyramid-slide-down/greedy/index.html +++ b/algorithms/recursion/pyramid-slide-down/greedy/index.html @@ -16,8 +16,8 @@ - - + +

    Greedy solution

    We will try to optimize it a bit. Let's start with a relatively simple greedy diff --git a/algorithms/recursion/pyramid-slide-down/index.html b/algorithms/recursion/pyramid-slide-down/index.html index c5095c8..d1c7e0b 100644 --- a/algorithms/recursion/pyramid-slide-down/index.html +++ b/algorithms/recursion/pyramid-slide-down/index.html @@ -16,8 +16,8 @@ - - + +

    Introduction to dynamic programming

    In this series we will try to solve one problem in different ways.

    diff --git a/algorithms/recursion/pyramid-slide-down/naive/index.html b/algorithms/recursion/pyramid-slide-down/naive/index.html index c3fead4..8daf9f4 100644 --- a/algorithms/recursion/pyramid-slide-down/naive/index.html +++ b/algorithms/recursion/pyramid-slide-down/naive/index.html @@ -16,8 +16,8 @@ - - + +

    Naïve solution

    Our naïve solution consists of trying out all the possible slides and finding diff --git a/algorithms/recursion/pyramid-slide-down/top-down-dp/index.html b/algorithms/recursion/pyramid-slide-down/top-down-dp/index.html index f83bb8e..dc36a30 100644 --- a/algorithms/recursion/pyramid-slide-down/top-down-dp/index.html +++ b/algorithms/recursion/pyramid-slide-down/top-down-dp/index.html @@ -16,8 +16,8 @@ - - + +

    Top-down dynamic programming

    diff --git a/algorithms/tags/a-star/index.html b/algorithms/tags/a-star/index.html index d738775..836aea9 100644 --- a/algorithms/tags/a-star/index.html +++ b/algorithms/tags/a-star/index.html @@ -14,8 +14,8 @@ - - + +

    One doc tagged with "a star"

    View All Tags

    From BF to A*

    Figuring out shortest-path problem from the BF to the A* algorithm. diff --git a/algorithms/tags/applications/index.html b/algorithms/tags/applications/index.html index e013458..b293210 100644 --- a/algorithms/tags/applications/index.html +++ b/algorithms/tags/applications/index.html @@ -14,8 +14,8 @@ - - + +

    One doc tagged with "applications"

    View All Tags

    Použití červeno-černých stromů

    Ukázka použití červeno-černých stromů v standardních knižnicích známých jazyků. diff --git a/algorithms/tags/astar/index.html b/algorithms/tags/astar/index.html index 2d5b797..57fabbd 100644 --- a/algorithms/tags/astar/index.html +++ b/algorithms/tags/astar/index.html @@ -14,8 +14,8 @@ - - + +

    One doc tagged with "astar"

    View All Tags

    A* algorithm

    Moving from Dijkstra's algorithm into the A* algorithm. diff --git a/algorithms/tags/backtracking/index.html b/algorithms/tags/backtracking/index.html index ae9d441..70b9d0a 100644 --- a/algorithms/tags/backtracking/index.html +++ b/algorithms/tags/backtracking/index.html @@ -14,8 +14,8 @@ - - + +

    2 docs tagged with "backtracking"

    View All Tags

    Recursion and backtracking with Robot Karel

    A problem with too many restrictions. diff --git a/algorithms/tags/balanced-trees/index.html b/algorithms/tags/balanced-trees/index.html index 59d6c8f..557aaee 100644 --- a/algorithms/tags/balanced-trees/index.html +++ b/algorithms/tags/balanced-trees/index.html @@ -14,8 +14,8 @@ - - + +

    2 docs tagged with "balanced trees"

    View All Tags

    On the rules of the red-black tree

    Shower thoughts on the rules of the red-black tree. diff --git a/algorithms/tags/bellman-ford/index.html b/algorithms/tags/bellman-ford/index.html index 0db787c..709c488 100644 --- a/algorithms/tags/bellman-ford/index.html +++ b/algorithms/tags/bellman-ford/index.html @@ -14,8 +14,8 @@ - - + +

    2 docs tagged with "bellman ford"

    View All Tags

    BF

    Solving the shortest path problem with a naïve approach that turns into diff --git a/algorithms/tags/bfs/index.html b/algorithms/tags/bfs/index.html index c8471f1..5105829 100644 --- a/algorithms/tags/bfs/index.html +++ b/algorithms/tags/bfs/index.html @@ -14,8 +14,8 @@ - - + +

    One doc tagged with "bfs"

    View All Tags

    Distance boundaries from BFS tree on undirected graphs

    Short explanation of distance boundaries deduced from a BFS tree. diff --git a/algorithms/tags/bottom-up-dp/index.html b/algorithms/tags/bottom-up-dp/index.html index 1b8fbd9..bf08a91 100644 --- a/algorithms/tags/bottom-up-dp/index.html +++ b/algorithms/tags/bottom-up-dp/index.html @@ -14,8 +14,8 @@ - - + +

    2 docs tagged with "bottom-up-dp"

    View All Tags

    Bottom-up DP solution

    Bottom-up DP solution of the Pyramid Slide Down. diff --git a/algorithms/tags/brute-force/index.html b/algorithms/tags/brute-force/index.html index 99402b6..e78edc4 100644 --- a/algorithms/tags/brute-force/index.html +++ b/algorithms/tags/brute-force/index.html @@ -14,8 +14,8 @@ - - + +

    2 docs tagged with "brute force"

    View All Tags

    BF

    Solving the shortest path problem with a naïve approach that turns into diff --git a/algorithms/tags/c/index.html b/algorithms/tags/c/index.html index 2dc2c3b..0e47f72 100644 --- a/algorithms/tags/c/index.html +++ b/algorithms/tags/c/index.html @@ -14,8 +14,8 @@ - - + +

    One doc tagged with "c"

    View All Tags

    Time complexity of ‹extend›

How to make an inefficient algorithm unknowingly. diff --git a/algorithms/tags/cpp/index.html b/algorithms/tags/cpp/index.html index 55248d4..4ff827f 100644 --- a/algorithms/tags/cpp/index.html +++ b/algorithms/tags/cpp/index.html @@ -14,8 +14,8 @@ - - + +

    7 docs tagged with "cpp"

    View All Tags

    A* algorithm

    Moving from Dijkstra's algorithm into the A* algorithm. diff --git a/algorithms/tags/csharp/index.html b/algorithms/tags/csharp/index.html index 331f6de..35197c5 100644 --- a/algorithms/tags/csharp/index.html +++ b/algorithms/tags/csharp/index.html @@ -14,8 +14,8 @@ - - + +

    One doc tagged with "csharp"

    View All Tags

    Iterative algorithms via iterators

    Iterative DFS using iterators. diff --git a/algorithms/tags/dijkstra/index.html b/algorithms/tags/dijkstra/index.html index 5fbe7fe..0f6e3ec 100644 --- a/algorithms/tags/dijkstra/index.html +++ b/algorithms/tags/dijkstra/index.html @@ -14,8 +14,8 @@ - - + +

    2 docs tagged with "dijkstra"

    View All Tags

    Dijkstra's algorithm

Moving from Bellman-Ford into Dijkstra's algorithm. diff --git a/algorithms/tags/dynamic-array/index.html b/algorithms/tags/dynamic-array/index.html index ab148f6..7c9f1a8 100644 --- a/algorithms/tags/dynamic-array/index.html +++ b/algorithms/tags/dynamic-array/index.html @@ -14,8 +14,8 @@ - - + +

    One doc tagged with "dynamic array"

    View All Tags

    Time complexity of ‹extend›

How to make an inefficient algorithm unknowingly. diff --git a/algorithms/tags/dynamic-programming/index.html b/algorithms/tags/dynamic-programming/index.html index ed413fb..da64338 100644 --- a/algorithms/tags/dynamic-programming/index.html +++ b/algorithms/tags/dynamic-programming/index.html @@ -14,8 +14,8 @@ - - + +

    7 docs tagged with "dynamic programming"

    View All Tags

    A* algorithm

    Moving from Dijkstra's algorithm into the A* algorithm. diff --git a/algorithms/tags/exponential/index.html b/algorithms/tags/exponential/index.html index 5afddfb..3901036 100644 --- a/algorithms/tags/exponential/index.html +++ b/algorithms/tags/exponential/index.html @@ -14,8 +14,8 @@ - - + +

    2 docs tagged with "exponential"

    View All Tags

    Introduction to dynamic programming

    Solving a problem in different ways. diff --git a/algorithms/tags/graphs/index.html b/algorithms/tags/graphs/index.html index 5e892ca..0a4ed7a 100644 --- a/algorithms/tags/graphs/index.html +++ b/algorithms/tags/graphs/index.html @@ -14,8 +14,8 @@ - - + +

    2 docs tagged with "graphs"

    View All Tags

    Distance boundaries from BFS tree on undirected graphs

    Short explanation of distance boundaries deduced from a BFS tree. diff --git a/algorithms/tags/greedy/index.html b/algorithms/tags/greedy/index.html index ea84002..9421b8d 100644 --- a/algorithms/tags/greedy/index.html +++ b/algorithms/tags/greedy/index.html @@ -14,8 +14,8 @@ - - + +

    4 docs tagged with "greedy"

    View All Tags

    Dijkstra's algorithm

Moving from Bellman-Ford into Dijkstra's algorithm. diff --git a/algorithms/tags/hash-tables/index.html b/algorithms/tags/hash-tables/index.html index 04a5445..1572b83 100644 --- a/algorithms/tags/hash-tables/index.html +++ b/algorithms/tags/hash-tables/index.html @@ -14,8 +14,8 @@ - - + +

    3 docs tagged with "hash-tables"

    View All Tags

    Breaking hash table

    How to get the linear time complexity in a hash table. diff --git a/algorithms/tags/index.html b/algorithms/tags/index.html index a374c31..cc4636b 100644 --- a/algorithms/tags/index.html +++ b/algorithms/tags/index.html @@ -14,8 +14,8 @@ - - + +

    diff --git a/algorithms/tags/iterative/index.html b/algorithms/tags/iterative/index.html index f6e8fd0..ab29c32 100644 --- a/algorithms/tags/iterative/index.html +++ b/algorithms/tags/iterative/index.html @@ -14,8 +14,8 @@ - - + +

    One doc tagged with "iterative"

    View All Tags

    Iterative algorithms via iterators

    Iterative DFS using iterators. diff --git a/algorithms/tags/iterators/index.html b/algorithms/tags/iterators/index.html index bcdb3ba..c840a94 100644 --- a/algorithms/tags/iterators/index.html +++ b/algorithms/tags/iterators/index.html @@ -14,8 +14,8 @@ - - + +

    One doc tagged with "iterators"

    View All Tags

    Iterative algorithms via iterators

    Iterative DFS using iterators. diff --git a/algorithms/tags/java/index.html b/algorithms/tags/java/index.html index d87dc06..664622d 100644 --- a/algorithms/tags/java/index.html +++ b/algorithms/tags/java/index.html @@ -14,8 +14,8 @@ - - + +

    5 docs tagged with "java"

    View All Tags

    Bottom-up DP solution

    Bottom-up DP solution of the Pyramid Slide Down. diff --git a/algorithms/tags/karel/index.html b/algorithms/tags/karel/index.html index 9394687..3b3978c 100644 --- a/algorithms/tags/karel/index.html +++ b/algorithms/tags/karel/index.html @@ -14,8 +14,8 @@ - - + +

    2 docs tagged with "karel"

    View All Tags

    Recursion and backtracking with Robot Karel

    A problem with too many restrictions. diff --git a/algorithms/tags/postconditions/index.html b/algorithms/tags/postconditions/index.html index 1f974ee..e2ba26d 100644 --- a/algorithms/tags/postconditions/index.html +++ b/algorithms/tags/postconditions/index.html @@ -14,8 +14,8 @@ - - + +

    One doc tagged with "postconditions"

    View All Tags

    Vague postconditions and proving correctness of algorithms

    Debugging and testing with precise postconditions. diff --git a/algorithms/tags/python/index.html b/algorithms/tags/python/index.html index 60f3702..2f01c06 100644 --- a/algorithms/tags/python/index.html +++ b/algorithms/tags/python/index.html @@ -14,8 +14,8 @@ - - + +

    7 docs tagged with "python"

    View All Tags

    Breaking hash table

    How to get the linear time complexity in a hash table. diff --git a/algorithms/tags/recursion/index.html b/algorithms/tags/recursion/index.html index af08f0f..703aa06 100644 --- a/algorithms/tags/recursion/index.html +++ b/algorithms/tags/recursion/index.html @@ -14,8 +14,8 @@ - - + +

    5 docs tagged with "recursion"

    View All Tags

    Introduction to dynamic programming

    Solving a problem in different ways. diff --git a/algorithms/tags/red-black-trees/index.html b/algorithms/tags/red-black-trees/index.html index 3292777..3109e83 100644 --- a/algorithms/tags/red-black-trees/index.html +++ b/algorithms/tags/red-black-trees/index.html @@ -14,8 +14,8 @@ - - + +

    2 docs tagged with "red-black trees"

    View All Tags

    On the rules of the red-black tree

    Shower thoughts on the rules of the red-black tree. diff --git a/algorithms/tags/solution/index.html b/algorithms/tags/solution/index.html index ff90816..1508ec8 100644 --- a/algorithms/tags/solution/index.html +++ b/algorithms/tags/solution/index.html @@ -14,8 +14,8 @@ - - + +

    One doc tagged with "solution"

    View All Tags

    Solution to the problem

    Solving the problem introduced in the previous post. diff --git a/algorithms/tags/sorting/index.html b/algorithms/tags/sorting/index.html index 0b1a052..eb36689 100644 --- a/algorithms/tags/sorting/index.html +++ b/algorithms/tags/sorting/index.html @@ -14,8 +14,8 @@ - - + +

    One doc tagged with "sorting"

    View All Tags

    Vague postconditions and proving correctness of algorithms

    Debugging and testing with precise postconditions. diff --git a/algorithms/tags/testing/index.html b/algorithms/tags/testing/index.html index b011d05..5aca5a1 100644 --- a/algorithms/tags/testing/index.html +++ b/algorithms/tags/testing/index.html @@ -14,8 +14,8 @@ - - + +

    One doc tagged with "testing"

    View All Tags

    Vague postconditions and proving correctness of algorithms

    Debugging and testing with precise postconditions. diff --git a/algorithms/tags/time-complexity/index.html b/algorithms/tags/time-complexity/index.html index 7591fdb..f6ffbc8 100644 --- a/algorithms/tags/time-complexity/index.html +++ b/algorithms/tags/time-complexity/index.html @@ -14,8 +14,8 @@ - - + +

    One doc tagged with "time complexity"

    View All Tags

    Time complexity of ‹extend›

How to make an inefficient algorithm unknowingly. diff --git a/algorithms/tags/top-down-dp/index.html b/algorithms/tags/top-down-dp/index.html index 2a7740f..415b6da 100644 --- a/algorithms/tags/top-down-dp/index.html +++ b/algorithms/tags/top-down-dp/index.html @@ -14,8 +14,8 @@ - - + +

    2 docs tagged with "top-down-dp"

    View All Tags

    Introduction to dynamic programming

    Solving a problem in different ways. diff --git a/algorithms/time-complexity/extend/index.html b/algorithms/time-complexity/extend/index.html index 9163928..8759cde 100644 --- a/algorithms/time-complexity/extend/index.html +++ b/algorithms/time-complexity/extend/index.html @@ -16,8 +16,8 @@ - - + +

    Time complexity of ‹extend›

    Introduction

    diff --git a/assets/js/1535ede8.fa065562.js b/assets/js/1535ede8.325cfecf.js similarity index 99% rename from assets/js/1535ede8.fa065562.js rename to assets/js/1535ede8.325cfecf.js index e1e31f4..b1e2e32 100644 --- a/assets/js/1535ede8.fa065562.js +++ b/assets/js/1535ede8.325cfecf.js @@ -1 +1 @@ -"use strict";(self.webpackChunkfi=self.webpackChunkfi||[]).push([[5376],{44969:(e,n,t)=>{t.r(n),t.d(n,{assets:()=>c,contentTitle:()=>o,default:()=>h,frontMatter:()=>r,metadata:()=>a,toc:()=>l});var s=t(85893),i=t(11151);const r={id:"seminar-10",title:"10th seminar",description:"Finding bugs in a hangman.\n"},o=void 0,a={id:"bonuses/seminar-10",title:"10th seminar",description:"Finding bugs in a hangman.\n",source:"@site/c/bonuses/10.md",sourceDirName:"bonuses",slug:"/bonuses/seminar-10",permalink:"/c/bonuses/seminar-10",draft:!1,unlisted:!1,editUrl:"https://github.com/mfocko/blog/tree/main/c/bonuses/10.md",tags:[],version:"current",lastUpdatedAt:1718801795,formattedLastUpdatedAt:"Jun 19, 2024",frontMatter:{id:"seminar-10",title:"10th seminar",description:"Finding bugs in a hangman.\n"},sidebar:"autogeneratedBar",previous:{title:"8th seminar",permalink:"/c/bonuses/seminar-08"},next:{title:"Practice Exams",permalink:"/c/category/practice-exams"}},c={},l=[{value:"Introduction",id:"introduction",level:2},{value:"Project",id:"project",level:2},{value:"Summary of the gameplay",id:"summary-of-the-gameplay",level:3},{value:"Suggested workflow",id:"suggested-workflow",level:2},{value:"Tasks",id:"tasks",level:2},{value:"Dictionary",id:"dictionary",level:2},{value:"Submitting",id:"submitting",level:2}];function d(e){const n={a:"a",blockquote:"blockquote",code:"code",em:"em",h2:"h2",h3:"h3",hr:"hr",img:"img",li:"li",ol:"ol",p:"p",pre:"pre",table:"table",tbody:"tbody",td:"td",th:"th",thead:"thead",tr:"tr",ul:"ul",...(0,i.a)(),...e.components};return(0,s.jsxs)(s.Fragment,{children:[(0,s.jsx)(n.p,{children:(0,s.jsx)(n.a,{href:"pathname:///files/c/bonuses/10.tar.gz",children:"Source"})}),"\n",(0,s.jsx)(n.h2,{id:"introduction",children:"Introduction"}),"\n",(0,s.jsx)(n.p,{children:"For this bonus you are given almost finished project - The Hangman Game. 
Your\ntask is to try the game, in case you find any bugs point them out and cover as\nmuch of the game as possible with tests."}),"\n",(0,s.jsx)(n.p,{children:"For this bonus you can get at maximum 2 K\u20a1."}),"\n",(0,s.jsxs)(n.table,{children:[(0,s.jsx)(n.thead,{children:(0,s.jsxs)(n.tr,{children:[(0,s.jsx)(n.th,{children:"Item"}),(0,s.jsx)(n.th,{children:"Bonus"})]})}),(0,s.jsxs)(n.tbody,{children:[(0,s.jsxs)(n.tr,{children:[(0,s.jsx)(n.td,{children:"Fixing bugs from failing tests"}),(0,s.jsx)(n.td,{children:"0.25"})]}),(0,s.jsxs)(n.tr,{children:[(0,s.jsx)(n.td,{children:(0,s.jsx)(n.code,{children:"word_guessed"})}),(0,s.jsx)(n.td,{children:"0.50"})]}),(0,s.jsxs)(n.tr,{children:[(0,s.jsx)(n.td,{children:"Hidden bug"}),(0,s.jsx)(n.td,{children:"0.50"})]}),(0,s.jsxs)(n.tr,{children:[(0,s.jsx)(n.td,{children:"Extending tests, undetectable bugs or evil bug"}),(0,s.jsx)(n.td,{children:"0.37"})]}),(0,s.jsxs)(n.tr,{children:[(0,s.jsx)(n.td,{children:"Refactor"}),(0,s.jsx)(n.td,{children:"0.38"})]})]})]}),"\n",(0,s.jsx)(n.h2,{id:"project",children:"Project"}),"\n",(0,s.jsxs)(n.p,{children:["Project consists of 2 source files - ",(0,s.jsx)(n.code,{children:"hangman.c"})," and ",(0,s.jsx)(n.code,{children:"main.c"}),"."]}),"\n",(0,s.jsxs)(n.p,{children:[(0,s.jsx)(n.code,{children:"main.c"})," is quite short and concise, there is nothing for you to do."]}),"\n",(0,s.jsxs)(n.p,{children:[(0,s.jsx)(n.code,{children:"hangman.c"})," contains implementation of the game. In case you feel lost, consult\nthe documentation in ",(0,s.jsx)(n.code,{children:"hangman.h"})," that represents an interface that can be used\nfor implementing the game."]}),"\n",(0,s.jsxs)(n.p,{children:["Apart from those sources this project is a bit more complicated. ",(0,s.jsx)(n.em,{children:"Game loop"})," is\nrealised via single encapsulated function that complicates the testing. Because\nof that, there are 2 kinds of tests:"]}),"\n",(0,s.jsxs)(n.ul,{children:["\n",(0,s.jsxs)(n.li,{children:["\n",(0,s.jsxs)(n.p,{children:[(0,s.jsx)(n.em,{children:"Unit tests"})," - that are present in ",(0,s.jsx)(n.code,{children:"test_hangman.c"})," and can be run via:"]}),"\n",(0,s.jsx)(n.pre,{children:(0,s.jsx)(n.code,{children:"$ make check-unit\n"})}),"\n",(0,s.jsx)(n.p,{children:"They cover majorly functions that can be tested easily via testing framework."}),"\n"]}),"\n",(0,s.jsxs)(n.li,{children:["\n",(0,s.jsxs)(n.p,{children:[(0,s.jsx)(n.em,{children:"Functional tests"})," - same as in ",(0,s.jsx)(n.code,{children:"seminar-08"})," and are focused on testing the\nprogram as whole. Basic smoke test is already included in ",(0,s.jsx)(n.code,{children:"usage"})," test case."]}),"\n",(0,s.jsx)(n.p,{children:"They can be run via:"}),"\n",(0,s.jsx)(n.pre,{children:(0,s.jsx)(n.code,{children:"$ make check-functional\n"})}),"\n",(0,s.jsxs)(n.p,{children:["When testing ",(0,s.jsx)(n.code,{children:"hangman"})," function (the game loop), it is suggested to create\nfunctional tests."]}),"\n",(0,s.jsx)(n.p,{children:"When submitting the files for review, please leave out functional tests that\nwere given as a part of the assignment, so that it is easier to navigate, I\nwill drag the common files myself. 
:)"}),"\n"]}),"\n"]}),"\n",(0,s.jsxs)(n.blockquote,{children:["\n",(0,s.jsx)(n.p,{children:"Whole test suite can be run via:"}),"\n",(0,s.jsx)(n.pre,{children:(0,s.jsx)(n.code,{children:"$ make check\n"})}),"\n"]}),"\n",(0,s.jsx)(n.h3,{id:"summary-of-the-gameplay",children:"Summary of the gameplay"}),"\n",(0,s.jsxs)(n.ol,{children:["\n",(0,s.jsx)(n.li,{children:"Secret word gets chosen from the file that's path is given as an argument."}),"\n",(0,s.jsx)(n.li,{children:"You get 8 guesses."}),"\n",(0,s.jsx)(n.li,{children:"Invalid characters don't count."}),"\n",(0,s.jsx)(n.li,{children:"Already guessed characters don't count, even if not included in the secret."}),"\n",(0,s.jsxs)(n.li,{children:["You can guess the whole word at once","\n",(0,s.jsxs)(n.ul,{children:["\n",(0,s.jsx)(n.li,{children:"If you get it right, you won, game ends."}),"\n",(0,s.jsx)(n.li,{children:"If you don't get it right, you get to see the secret, game ends."}),"\n"]}),"\n"]}),"\n",(0,s.jsx)(n.li,{children:"In case of end of input, game finishes via force."}),"\n",(0,s.jsx)(n.li,{children:"In case of invalid input, no guesses are subtracted, game carries on."}),"\n",(0,s.jsx)(n.li,{children:"Letters and words are not case sensitive."}),"\n"]}),"\n",(0,s.jsx)(n.h2,{id:"suggested-workflow",children:"Suggested workflow"}),"\n",(0,s.jsxs)(n.p,{children:["As we have talked about on the seminar, I suggest you to follow\n",(0,s.jsx)(n.em,{children:"Test-Driven Development"}),"\nin this case."]}),"\n",(0,s.jsx)(n.p,{children:(0,s.jsx)(n.img,{alt:"TDD workflow",src:t(27420).Z+"",width:"2814",height:"1652"})}),"\n",(0,s.jsx)(n.p,{children:"In our current scenario we are already in the stage of refactoring and fixing the\nbugs. Therefore try to follow this succession of steps:"}),"\n",(0,s.jsxs)(n.ol,{children:["\n",(0,s.jsx)(n.li,{children:"Try to reproduce the bug."}),"\n",(0,s.jsx)(n.li,{children:"Create a test that proves the presence of the bug."}),"\n",(0,s.jsx)(n.li,{children:"Fix the bug."}),"\n"]}),"\n",(0,s.jsxs)(n.p,{children:["In case you are submitting the bonus via GitLab, it is helpful to commit tests\nbefore commiting the fixes, so that it is apparent that the bug is manifested.\nExample of ",(0,s.jsx)(n.code,{children:"git log"})," (notice that the first line represents latest commit):"]}),"\n",(0,s.jsx)(n.pre,{children:(0,s.jsx)(n.code,{children:"feat: Implement fizz_buzzer\ntest: Add tests for fizz_buzzer\nfix: Fix NULL-check in print_name\ntest: Add test for NULL in print_name\n"})}),"\n",(0,s.jsx)(n.h2,{id:"tasks",children:"Tasks"}),"\n",(0,s.jsx)(n.p,{children:"As to your tasks, there are multiple things wrong in this project."}),"\n",(0,s.jsxs)(n.ol,{children:["\n",(0,s.jsxs)(n.li,{children:["\n",(0,s.jsx)(n.p,{children:'There are 2 "bugs" that cannot be detected via tests, i.e. they are not bugs\nthat affect functionality of the game.'}),"\n"]}),"\n",(0,s.jsxs)(n.li,{children:["\n",(0,s.jsxs)(n.p,{children:["There is one evil bug in ",(0,s.jsx)(n.code,{children:"get_word"}),". It is not required to be fixed ;) Assign\nit the lowest priority."]}),"\n"]}),"\n",(0,s.jsxs)(n.li,{children:["\n",(0,s.jsx)(n.p,{children:"There are some tests failing. 
Please try to figure it out, so you have green\ntests for the rest :)"}),"\n"]}),"\n",(0,s.jsxs)(n.li,{children:["\n",(0,s.jsxs)(n.p,{children:["We have gotten a bug report for ",(0,s.jsx)(n.code,{children:"word_guessed"}),", all we got is"]}),"\n",(0,s.jsxs)(n.blockquote,{children:["\n",(0,s.jsxs)(n.p,{children:["doesn't work when there are too many ",(0,s.jsx)(n.code,{children:"a"}),"s"]}),"\n"]}),"\n",(0,s.jsx)(n.p,{children:"Please try to replicate the bug and create a tests, so we don't get any\nregression later on."}),"\n"]}),"\n",(0,s.jsxs)(n.li,{children:["\n",(0,s.jsx)(n.p,{children:"One hidden bug :) Closely non-specified, we cannot reproduce it and we were\ndrunk while playing the game, so we don't remember a thing. :/"}),"\n"]}),"\n",(0,s.jsxs)(n.li,{children:["\n",(0,s.jsx)(n.p,{children:"Try to cover as much code via the tests as possible. We are not going to look\nat the metrics, but DRY is violated a lot, so as a last task try to remove as\nmuch of the duplicit code as possible."}),"\n",(0,s.jsx)(n.p,{children:"Tests should help you a lot in case there are some regressions."}),"\n"]}),"\n"]}),"\n",(0,s.jsx)(n.hr,{}),"\n",(0,s.jsxs)(n.p,{children:["In case you wonder why there are always 3 same words in the file with words, it\nis because of the ",(0,s.jsx)(n.code,{children:"get_word"})," bug. It is not a bug that can be easily fixed, so\nit is a not requirement at all and you can still get all points for the bonus ;)"]}),"\n",(0,s.jsx)(n.h2,{id:"dictionary",children:"Dictionary"}),"\n",(0,s.jsxs)(n.ul,{children:["\n",(0,s.jsx)(n.li,{children:(0,s.jsx)(n.a,{href:"https://en.wikipedia.org/wiki/Functional_testing",children:"Functional tests"})}),"\n",(0,s.jsx)(n.li,{children:(0,s.jsx)(n.a,{href:"https://en.wikipedia.org/wiki/Smoke_testing_%28software%29",children:"Smoke test"})}),"\n",(0,s.jsx)(n.li,{children:(0,s.jsx)(n.a,{href:"https://en.wikipedia.org/wiki/Don%27t_repeat_yourself",children:"DRY"})}),"\n"]}),"\n",(0,s.jsx)(n.h2,{id:"submitting",children:"Submitting"}),"\n",(0,s.jsx)(n.p,{children:"In case you have any questions, feel free to reach out to me."}),"\n",(0,s.jsx)(n.hr,{})]})}function h(e={}){const{wrapper:n}={...(0,i.a)(),...e.components};return n?(0,s.jsx)(n,{...e,children:(0,s.jsx)(d,{...e})}):d(e)}},27420:(e,n,t)=>{t.d(n,{Z:()=>s});const s=t.p+"assets/images/tdd_lifecycle-327ad9ee0ed8318ed11e19a28e02b2cc.png"},11151:(e,n,t)=>{t.d(n,{Z:()=>a,a:()=>o});var s=t(67294);const i={},r=s.createContext(i);function o(e){const n=s.useContext(r);return s.useMemo((function(){return"function"==typeof e?e(n):{...n,...e}}),[n,e])}function a(e){let n;return n=e.disableParentContext?"function"==typeof e.components?e.components(i):e.components||i:o(e.components),s.createElement(r.Provider,{value:n},e.children)}}}]); \ No newline at end of file +"use strict";(self.webpackChunkfi=self.webpackChunkfi||[]).push([[5376],{44969:(e,n,t)=>{t.r(n),t.d(n,{assets:()=>c,contentTitle:()=>o,default:()=>h,frontMatter:()=>r,metadata:()=>a,toc:()=>l});var s=t(85893),i=t(11151);const r={id:"seminar-10",title:"10th seminar",description:"Finding bugs in a hangman.\n"},o=void 0,a={id:"bonuses/seminar-10",title:"10th seminar",description:"Finding bugs in a hangman.\n",source:"@site/c/bonuses/10.md",sourceDirName:"bonuses",slug:"/bonuses/seminar-10",permalink:"/c/bonuses/seminar-10",draft:!1,unlisted:!1,editUrl:"https://github.com/mfocko/blog/tree/main/c/bonuses/10.md",tags:[],version:"current",lastUpdatedAt:1718813674,formattedLastUpdatedAt:"Jun 19, 2024",frontMatter:{id:"seminar-10",title:"10th 
seminar",description:"Finding bugs in a hangman.\n"},sidebar:"autogeneratedBar",previous:{title:"8th seminar",permalink:"/c/bonuses/seminar-08"},next:{title:"Practice Exams",permalink:"/c/category/practice-exams"}},c={},l=[{value:"Introduction",id:"introduction",level:2},{value:"Project",id:"project",level:2},{value:"Summary of the gameplay",id:"summary-of-the-gameplay",level:3},{value:"Suggested workflow",id:"suggested-workflow",level:2},{value:"Tasks",id:"tasks",level:2},{value:"Dictionary",id:"dictionary",level:2},{value:"Submitting",id:"submitting",level:2}];function d(e){const n={a:"a",blockquote:"blockquote",code:"code",em:"em",h2:"h2",h3:"h3",hr:"hr",img:"img",li:"li",ol:"ol",p:"p",pre:"pre",table:"table",tbody:"tbody",td:"td",th:"th",thead:"thead",tr:"tr",ul:"ul",...(0,i.a)(),...e.components};return(0,s.jsxs)(s.Fragment,{children:[(0,s.jsx)(n.p,{children:(0,s.jsx)(n.a,{href:"pathname:///files/c/bonuses/10.tar.gz",children:"Source"})}),"\n",(0,s.jsx)(n.h2,{id:"introduction",children:"Introduction"}),"\n",(0,s.jsx)(n.p,{children:"For this bonus you are given almost finished project - The Hangman Game. Your\ntask is to try the game, in case you find any bugs point them out and cover as\nmuch of the game as possible with tests."}),"\n",(0,s.jsx)(n.p,{children:"For this bonus you can get at maximum 2 K\u20a1."}),"\n",(0,s.jsxs)(n.table,{children:[(0,s.jsx)(n.thead,{children:(0,s.jsxs)(n.tr,{children:[(0,s.jsx)(n.th,{children:"Item"}),(0,s.jsx)(n.th,{children:"Bonus"})]})}),(0,s.jsxs)(n.tbody,{children:[(0,s.jsxs)(n.tr,{children:[(0,s.jsx)(n.td,{children:"Fixing bugs from failing tests"}),(0,s.jsx)(n.td,{children:"0.25"})]}),(0,s.jsxs)(n.tr,{children:[(0,s.jsx)(n.td,{children:(0,s.jsx)(n.code,{children:"word_guessed"})}),(0,s.jsx)(n.td,{children:"0.50"})]}),(0,s.jsxs)(n.tr,{children:[(0,s.jsx)(n.td,{children:"Hidden bug"}),(0,s.jsx)(n.td,{children:"0.50"})]}),(0,s.jsxs)(n.tr,{children:[(0,s.jsx)(n.td,{children:"Extending tests, undetectable bugs or evil bug"}),(0,s.jsx)(n.td,{children:"0.37"})]}),(0,s.jsxs)(n.tr,{children:[(0,s.jsx)(n.td,{children:"Refactor"}),(0,s.jsx)(n.td,{children:"0.38"})]})]})]}),"\n",(0,s.jsx)(n.h2,{id:"project",children:"Project"}),"\n",(0,s.jsxs)(n.p,{children:["Project consists of 2 source files - ",(0,s.jsx)(n.code,{children:"hangman.c"})," and ",(0,s.jsx)(n.code,{children:"main.c"}),"."]}),"\n",(0,s.jsxs)(n.p,{children:[(0,s.jsx)(n.code,{children:"main.c"})," is quite short and concise, there is nothing for you to do."]}),"\n",(0,s.jsxs)(n.p,{children:[(0,s.jsx)(n.code,{children:"hangman.c"})," contains implementation of the game. In case you feel lost, consult\nthe documentation in ",(0,s.jsx)(n.code,{children:"hangman.h"})," that represents an interface that can be used\nfor implementing the game."]}),"\n",(0,s.jsxs)(n.p,{children:["Apart from those sources this project is a bit more complicated. ",(0,s.jsx)(n.em,{children:"Game loop"})," is\nrealised via single encapsulated function that complicates the testing. 
Because\nof that, there are 2 kinds of tests:"]}),"\n",(0,s.jsxs)(n.ul,{children:["\n",(0,s.jsxs)(n.li,{children:["\n",(0,s.jsxs)(n.p,{children:[(0,s.jsx)(n.em,{children:"Unit tests"})," - that are present in ",(0,s.jsx)(n.code,{children:"test_hangman.c"})," and can be run via:"]}),"\n",(0,s.jsx)(n.pre,{children:(0,s.jsx)(n.code,{children:"$ make check-unit\n"})}),"\n",(0,s.jsx)(n.p,{children:"They cover majorly functions that can be tested easily via testing framework."}),"\n"]}),"\n",(0,s.jsxs)(n.li,{children:["\n",(0,s.jsxs)(n.p,{children:[(0,s.jsx)(n.em,{children:"Functional tests"})," - same as in ",(0,s.jsx)(n.code,{children:"seminar-08"})," and are focused on testing the\nprogram as whole. Basic smoke test is already included in ",(0,s.jsx)(n.code,{children:"usage"})," test case."]}),"\n",(0,s.jsx)(n.p,{children:"They can be run via:"}),"\n",(0,s.jsx)(n.pre,{children:(0,s.jsx)(n.code,{children:"$ make check-functional\n"})}),"\n",(0,s.jsxs)(n.p,{children:["When testing ",(0,s.jsx)(n.code,{children:"hangman"})," function (the game loop), it is suggested to create\nfunctional tests."]}),"\n",(0,s.jsx)(n.p,{children:"When submitting the files for review, please leave out functional tests that\nwere given as a part of the assignment, so that it is easier to navigate, I\nwill drag the common files myself. :)"}),"\n"]}),"\n"]}),"\n",(0,s.jsxs)(n.blockquote,{children:["\n",(0,s.jsx)(n.p,{children:"Whole test suite can be run via:"}),"\n",(0,s.jsx)(n.pre,{children:(0,s.jsx)(n.code,{children:"$ make check\n"})}),"\n"]}),"\n",(0,s.jsx)(n.h3,{id:"summary-of-the-gameplay",children:"Summary of the gameplay"}),"\n",(0,s.jsxs)(n.ol,{children:["\n",(0,s.jsx)(n.li,{children:"Secret word gets chosen from the file that's path is given as an argument."}),"\n",(0,s.jsx)(n.li,{children:"You get 8 guesses."}),"\n",(0,s.jsx)(n.li,{children:"Invalid characters don't count."}),"\n",(0,s.jsx)(n.li,{children:"Already guessed characters don't count, even if not included in the secret."}),"\n",(0,s.jsxs)(n.li,{children:["You can guess the whole word at once","\n",(0,s.jsxs)(n.ul,{children:["\n",(0,s.jsx)(n.li,{children:"If you get it right, you won, game ends."}),"\n",(0,s.jsx)(n.li,{children:"If you don't get it right, you get to see the secret, game ends."}),"\n"]}),"\n"]}),"\n",(0,s.jsx)(n.li,{children:"In case of end of input, game finishes via force."}),"\n",(0,s.jsx)(n.li,{children:"In case of invalid input, no guesses are subtracted, game carries on."}),"\n",(0,s.jsx)(n.li,{children:"Letters and words are not case sensitive."}),"\n"]}),"\n",(0,s.jsx)(n.h2,{id:"suggested-workflow",children:"Suggested workflow"}),"\n",(0,s.jsxs)(n.p,{children:["As we have talked about on the seminar, I suggest you to follow\n",(0,s.jsx)(n.em,{children:"Test-Driven Development"}),"\nin this case."]}),"\n",(0,s.jsx)(n.p,{children:(0,s.jsx)(n.img,{alt:"TDD workflow",src:t(27420).Z+"",width:"2814",height:"1652"})}),"\n",(0,s.jsx)(n.p,{children:"In our current scenario we are already in the stage of refactoring and fixing the\nbugs. 
Therefore try to follow this succession of steps:"}),"\n",(0,s.jsxs)(n.ol,{children:["\n",(0,s.jsx)(n.li,{children:"Try to reproduce the bug."}),"\n",(0,s.jsx)(n.li,{children:"Create a test that proves the presence of the bug."}),"\n",(0,s.jsx)(n.li,{children:"Fix the bug."}),"\n"]}),"\n",(0,s.jsxs)(n.p,{children:["In case you are submitting the bonus via GitLab, it is helpful to commit tests\nbefore commiting the fixes, so that it is apparent that the bug is manifested.\nExample of ",(0,s.jsx)(n.code,{children:"git log"})," (notice that the first line represents latest commit):"]}),"\n",(0,s.jsx)(n.pre,{children:(0,s.jsx)(n.code,{children:"feat: Implement fizz_buzzer\ntest: Add tests for fizz_buzzer\nfix: Fix NULL-check in print_name\ntest: Add test for NULL in print_name\n"})}),"\n",(0,s.jsx)(n.h2,{id:"tasks",children:"Tasks"}),"\n",(0,s.jsx)(n.p,{children:"As to your tasks, there are multiple things wrong in this project."}),"\n",(0,s.jsxs)(n.ol,{children:["\n",(0,s.jsxs)(n.li,{children:["\n",(0,s.jsx)(n.p,{children:'There are 2 "bugs" that cannot be detected via tests, i.e. they are not bugs\nthat affect functionality of the game.'}),"\n"]}),"\n",(0,s.jsxs)(n.li,{children:["\n",(0,s.jsxs)(n.p,{children:["There is one evil bug in ",(0,s.jsx)(n.code,{children:"get_word"}),". It is not required to be fixed ;) Assign\nit the lowest priority."]}),"\n"]}),"\n",(0,s.jsxs)(n.li,{children:["\n",(0,s.jsx)(n.p,{children:"There are some tests failing. Please try to figure it out, so you have green\ntests for the rest :)"}),"\n"]}),"\n",(0,s.jsxs)(n.li,{children:["\n",(0,s.jsxs)(n.p,{children:["We have gotten a bug report for ",(0,s.jsx)(n.code,{children:"word_guessed"}),", all we got is"]}),"\n",(0,s.jsxs)(n.blockquote,{children:["\n",(0,s.jsxs)(n.p,{children:["doesn't work when there are too many ",(0,s.jsx)(n.code,{children:"a"}),"s"]}),"\n"]}),"\n",(0,s.jsx)(n.p,{children:"Please try to replicate the bug and create a tests, so we don't get any\nregression later on."}),"\n"]}),"\n",(0,s.jsxs)(n.li,{children:["\n",(0,s.jsx)(n.p,{children:"One hidden bug :) Closely non-specified, we cannot reproduce it and we were\ndrunk while playing the game, so we don't remember a thing. :/"}),"\n"]}),"\n",(0,s.jsxs)(n.li,{children:["\n",(0,s.jsx)(n.p,{children:"Try to cover as much code via the tests as possible. We are not going to look\nat the metrics, but DRY is violated a lot, so as a last task try to remove as\nmuch of the duplicit code as possible."}),"\n",(0,s.jsx)(n.p,{children:"Tests should help you a lot in case there are some regressions."}),"\n"]}),"\n"]}),"\n",(0,s.jsx)(n.hr,{}),"\n",(0,s.jsxs)(n.p,{children:["In case you wonder why there are always 3 same words in the file with words, it\nis because of the ",(0,s.jsx)(n.code,{children:"get_word"})," bug. 
It is not a bug that can be easily fixed, so\nit is a not requirement at all and you can still get all points for the bonus ;)"]}),"\n",(0,s.jsx)(n.h2,{id:"dictionary",children:"Dictionary"}),"\n",(0,s.jsxs)(n.ul,{children:["\n",(0,s.jsx)(n.li,{children:(0,s.jsx)(n.a,{href:"https://en.wikipedia.org/wiki/Functional_testing",children:"Functional tests"})}),"\n",(0,s.jsx)(n.li,{children:(0,s.jsx)(n.a,{href:"https://en.wikipedia.org/wiki/Smoke_testing_%28software%29",children:"Smoke test"})}),"\n",(0,s.jsx)(n.li,{children:(0,s.jsx)(n.a,{href:"https://en.wikipedia.org/wiki/Don%27t_repeat_yourself",children:"DRY"})}),"\n"]}),"\n",(0,s.jsx)(n.h2,{id:"submitting",children:"Submitting"}),"\n",(0,s.jsx)(n.p,{children:"In case you have any questions, feel free to reach out to me."}),"\n",(0,s.jsx)(n.hr,{})]})}function h(e={}){const{wrapper:n}={...(0,i.a)(),...e.components};return n?(0,s.jsx)(n,{...e,children:(0,s.jsx)(d,{...e})}):d(e)}},27420:(e,n,t)=>{t.d(n,{Z:()=>s});const s=t.p+"assets/images/tdd_lifecycle-327ad9ee0ed8318ed11e19a28e02b2cc.png"},11151:(e,n,t)=>{t.d(n,{Z:()=>a,a:()=>o});var s=t(67294);const i={},r=s.createContext(i);function o(e){const n=s.useContext(r);return s.useMemo((function(){return"function"==typeof e?e(n):{...n,...e}}),[n,e])}function a(e){let n;return n=e.disableParentContext?"function"==typeof e.components?e.components(i):e.components||i:o(e.components),s.createElement(r.Provider,{value:n},e.children)}}}]); \ No newline at end of file diff --git a/assets/js/2d2e3e59.3df3ae6c.js b/assets/js/2d2e3e59.3df3ae6c.js new file mode 100644 index 0000000..6aba46e --- /dev/null +++ b/assets/js/2d2e3e59.3df3ae6c.js @@ -0,0 +1 @@ +"use strict";(self.webpackChunkfi=self.webpackChunkfi||[]).push([[6689],{55268:(e,t,n)=>{n.r(t),n.d(t,{assets:()=>h,contentTitle:()=>i,default:()=>d,frontMatter:()=>o,metadata:()=>r,toc:()=>l});var a=n(85893),s=n(11151);const o={title:"DevConf.cz 2024",description:"Sharing my experience on DevConf.cz 2024.\n",date:new Date("2024-06-19T00:00:00.000Z"),authors:[{key:"mf",title:"a.k.a. exhausted DevConf attendee"}],tags:["\ud83c\udfed","red-hat","fedora","devconf","conferences"]},i=void 0,r={permalink:"/blog/2024/06/19/devconf-2024",editUrl:"https://github.com/mfocko/blog/tree/main/blog/2024-06-19-devconf-2024.md",source:"@site/blog/2024-06-19-devconf-2024.md",title:"DevConf.cz 2024",description:"Sharing my experience on DevConf.cz 2024.\n",date:"2024-06-19T00:00:00.000Z",formattedDate:"June 19, 2024",tags:[{label:"\ud83c\udfed",permalink:"/blog/tags/\ud83c\udfed"},{label:"red-hat",permalink:"/blog/tags/red-hat"},{label:"fedora",permalink:"/blog/tags/fedora"},{label:"devconf",permalink:"/blog/tags/devconf"},{label:"conferences",permalink:"/blog/tags/conferences"}],readingTime:5.36,hasTruncateMarker:!0,authors:[{name:"Matej Focko",email:"me+blog@mfocko.xyz",title:"a.k.a. exhausted DevConf attendee",url:"https://gitlab.com/mfocko",imageURL:"https://github.com/mfocko.png",key:"mf"}],frontMatter:{title:"DevConf.cz 2024",description:"Sharing my experience on DevConf.cz 2024.\n",date:"2024-06-19T00:00:00.000Z",authors:[{key:"mf",title:"a.k.a. 
exhausted DevConf attendee"}],tags:["\ud83c\udfed","red-hat","fedora","devconf","conferences"]},unlisted:!1,nextItem:{title:"LTS distributions",permalink:"/blog/2024/02/07/lts-distros"}},h={authorsImageUrls:[void 0]},l=[{value:"Day 1",id:"day-1",level:2},{value:"Day 2",id:"day-2",level:2},{value:"Day 3",id:"day-3",level:2},{value:"Picks from the Packit Team",id:"picks-from-the-packit-team",level:2},{value:"Wrap up",id:"wrap-up",level:2}];function c(e){const t={a:"a",blockquote:"blockquote",em:"em",h2:"h2",img:"img",li:"li",ol:"ol",p:"p",section:"section",strong:"strong",sup:"sup",ul:"ul",...(0,s.a)(),...e.components};return(0,a.jsxs)(a.Fragment,{children:[(0,a.jsx)(t.p,{children:"I'd like to share my experience and views on some of the talks that I've\nattended on the DevConf.cz 2024."}),"\n",(0,a.jsx)(t.h2,{id:"day-1",children:"Day 1"}),"\n",(0,a.jsx)(t.p,{children:"Let's start with the first day which was Thursday this year as opposed to the\nprevious years when the conference started on Friday and finished on Sunday."}),"\n",(0,a.jsxs)(t.p,{children:["Let's start with the ",(0,a.jsx)(t.em,{children:(0,a.jsx)(t.a,{href:"https://pretalx.com/devconf-cz-2024/talk/AD3HWR/",children:"keynote"})}),". The keynote wasn't very interesting, at some of\nthe slides actually felt like advertisement for other talks on the topic of the\nAI\u2026"]}),"\n",(0,a.jsxs)(t.p,{children:["Next talk about ",(0,a.jsx)(t.em,{children:(0,a.jsx)(t.a,{href:"https://pretalx.com/devconf-cz-2024/talk/3UKGLB/",children:"event-driven Ansible"})})," was way more interesting. It allows you\nto run Ansible playbooks after provisioning hosts, or on certain events, such as\ndiscovered vulnerabilities. On one hand it feels like a very nice thing, but on\nthe other one I can't help but to think how you need to write the playbooks, so\nthat they are generic enough. One more example that's been given comes from the\npossibility to react to tickets, e.g., outages and this feels like something\nthat could be abused to cause DoS."]}),"\n",(0,a.jsxs)(t.p,{children:["Afterwards we've seen two lightning talks, one about\n",(0,a.jsx)(t.em,{children:(0,a.jsx)(t.a,{href:"https://pretalx.com/devconf-cz-2024/talk/KSDRWL/",children:"choosing the right OpenShift size"})})," which was a pretty quick, but listed all\nof the possible ways you can deploy OpenShift in detail. This lightning talk\nwas followed by the first AI (lightning) talk I've attended about\n",(0,a.jsx)(t.em,{children:(0,a.jsx)(t.a,{href:"https://pretalx.com/devconf-cz-2024/talk/H9QFLM/",children:"rapid prototyping"})})," of the open-source AI models."]}),"\n",(0,a.jsxs)(t.p,{children:["As someone who's involved in the automation of the RPM packaging and testing, of\ncourse, we had to attend ",(0,a.jsx)(t.em,{children:(0,a.jsx)(t.a,{href:"https://pretalx.com/devconf-cz-2024/talk/NNKT3F/",children:"Learning from Nix"})}),". Nix has a very intriguing\nconcept which is pretty powerful, but painful at the same time. 
This can be\nsummed up pretty nicely by ",(0,a.jsx)(t.a,{href:"https://twitch.tv/tsoding",children:"Tsoding"})," who got asked about some tips & tricks for\nsomeone who wants to try out NixOS:"]}),"\n",(0,a.jsxs)(t.blockquote,{children:["\n",(0,a.jsx)(t.p,{children:"Just don't."}),"\n"]}),"\n",(0,a.jsx)(t.p,{children:"And now we're moving into a section where everything revolves about the Packit\nTeam :)"}),"\n",(0,a.jsxs)(t.p,{children:["First talk about ",(0,a.jsx)(t.em,{children:(0,a.jsx)(t.a,{href:"https://pretalx.com/devconf-cz-2024/talk/ECU7QS/",children:"changelogs"})})," was an interactive session that was (probably)\nmeant to share different approaches we take to handle this rather convoluted\ntopic that involves changelogs on both upstream and also on downstream with no\nrules",(0,a.jsx)(t.sup,{children:(0,a.jsx)(t.a,{href:"#user-content-fn-1-ba1b92",id:"user-content-fnref-1-ba1b92","data-footnote-ref":!0,"aria-describedby":"footnote-label",children:"1"})}),"."]}),"\n",(0,a.jsx)(t.p,{children:(0,a.jsx)(t.img,{src:"https://i.imgur.com/YHstMAt.jpg",alt:"changelogs"})}),"\n",(0,a.jsxs)(t.p,{children:["Next one was about ",(0,a.jsx)(t.em,{children:(0,a.jsx)(t.a,{href:"https://pretalx.com/devconf-cz-2024/talk/7C38GJ/",children:"static analysis"})})," done by ",(0,a.jsx)(t.a,{href:"https://openscanhub.dev/",children:"OpenScanHub"}),". I like the idea of\nrunning the static analysis that can uncover nasty bugs (as it has been even\nshowed in the talk) at the same time as they are introduced. I gotta admit that\nafter seeing the UI of the ",(0,a.jsx)(t.a,{href:"https://openscanhub.fedoraproject.org/",children:"deployed OpenScanHub"})," on the Fedora Infra, I couldn't\nhelp but to think about the ",(0,a.jsx)(t.a,{href:"https://x.com/usgraphics",children:"United States Graphics Company"})," ","\ud83d\ude04"," The UI is\nto the point, no fancy annoying shit, you get what you need, it's hard to get\nlost. ",(0,a.jsx)(t.strong,{children:"Just simplicity."})," Best kind of UI/UX in my opinion."]}),"\n",(0,a.jsxs)(t.p,{children:["After the OpenScanHub talk we're getting to talks that were taken in a totally\ndifferent direction from the usual talks you're used to ","\ud83d\ude09"," First one was\ngiven title of ",(0,a.jsx)(t.em,{children:(0,a.jsx)(t.a,{href:"https://pretalx.com/devconf-cz-2024/talk/X8SYDG/",children:"\u201cIndiana Jones and obsoleted projects\u201d"})})," by ",(0,a.jsx)(t.a,{href:"https://rodina-sucha.cz/@mirek",children:"Mirek"}),". He talked\nabout projects that got obsoleted, but started with projects that had no\nrelation to IT field at all. 
I'd mark this talk as a ",(0,a.jsx)(t.em,{children:"stand up"})," without any\nhesitation."]}),"\n",(0,a.jsxs)(t.p,{children:["And finally we will wrap up the first day with the talk where speakers spoke the\nleast\u2026 ",(0,a.jsx)(t.em,{children:(0,a.jsx)(t.a,{href:"https://pretalx.com/devconf-cz-2024/talk/BDMWF3/",children:"\u201cLet the users speak!\u201d"})})," that involved users of both Packit and\nTesting Farm who spoke about their use case and benefits they gained from using\nboth services in a symbiosis."]}),"\n",(0,a.jsx)(t.h2,{id:"day-2",children:"Day 2"}),"\n",(0,a.jsxs)(t.p,{children:["On the second day I've attended less talks to not burn myself out :) I've\nstarted with an AI-related talk with title ",(0,a.jsx)(t.em,{children:(0,a.jsx)(t.a,{href:"https://pretalx.com/devconf-cz-2024/talk/QSF9QQ/",children:"\u201cAI: Open source will save us!\u201d"})}),",\neven though this talk has been improvised, as the speakers from the schedule\ncouldn't have attended, it provided a nice overview what ",(0,a.jsx)(t.a,{href:"https://github.com/instructlab/instructlab",children:"InstructLab"})," can do\nand how can you \u201cfeed\u201d the relevant info into the language models by yourself."]}),"\n",(0,a.jsxs)(t.p,{children:["After that I attended a ",(0,a.jsx)(t.em,{children:"\u201ccoffee enthusiasts Meetup\u201d"})," which was very nice and,\nof course, an organized chaos ","\ud83d\ude09"]}),"\n",(0,a.jsxs)(t.p,{children:["Before attending the social event I wrapped up the second day with a lightning\ntalk about ",(0,a.jsx)(t.em,{children:(0,a.jsx)(t.a,{href:"https://pretalx.com/devconf-cz-2024/talk/SXWE7K/",children:"recent updates in Toolbx"})}),". I've used both ",(0,a.jsx)(t.a,{href:"https://containertoolbx.org/",children:"toolbx"})," and\n",(0,a.jsx)(t.a,{href:"https://distrobox.it/",children:"distrobox"}),", so it's nice to see the improvements in progress and also that both\nprojects are well and lively."]}),"\n",(0,a.jsx)(t.h2,{id:"day-3",children:"Day 3"}),"\n",(0,a.jsxs)(t.p,{children:["On the third day I've attended only in the afternoon. \u201cStarted\u201d my day with\na discussion ",(0,a.jsx)(t.em,{children:(0,a.jsx)(t.a,{href:"https://pretalx.com/devconf-cz-2024/talk/8PARM8/",children:"\u201cLeadership: Where people skills meet programmers\u201d"})})," which was\nvery nice for gaining an insight into how developer, manager and QE lead roles\noverlap."]}),"\n",(0,a.jsxs)(t.p,{children:["That talk has been followed up by a talk about ",(0,a.jsx)(t.a,{href:"https://pretalx.com/devconf-cz-2024/talk/8T88MT/",children:"role rotation"})," in our Packit\nTeam. I would say it is a nice \u201cupgrade\u201d to the agile process which allows you\nto not create a single point of failure in the mundane and repetitive processes\nwithin your team."]}),"\n",(0,a.jsxs)(t.p,{children:["And this day has been finished off with a talk about ",(0,a.jsx)(t.a,{href:"https://pretalx.com/devconf-cz-2024/talk/WVNJZS/",children:"shifting left"})," in Podman.\nIt's nice to see how other teams utilize our Packit Service and also the\nservices we rely on, such as ",(0,a.jsx)(t.a,{href:"https://copr.fedorainfracloud.org/",children:"Copr"})," or ",(0,a.jsx)(t.a,{href:"https://docs.testing-farm.io/Testing%20Farm/0.1/index.html",children:"Testing Farm"}),". 
With the help of Cockpit\ntests they can catch breaking changes early on, or even bugs that have been\nintroduced and break usage of the dependent projects."]}),"\n",(0,a.jsx)(t.p,{children:(0,a.jsx)(t.img,{src:"https://i.imgur.com/bp6FxT9.jpg",alt:"shifting left"})}),"\n",(0,a.jsx)(t.h2,{id:"picks-from-the-packit-team",children:"Picks from the Packit Team"}),"\n",(0,a.jsx)(t.p,{children:"On the Tuesday, during our Packit stand up, I have managed to abuse my\nKanban Lead role to collect some of the talks that each of us would recommend:"}),"\n",(0,a.jsxs)(t.ul,{children:["\n",(0,a.jsxs)(t.li,{children:[(0,a.jsx)(t.a,{href:"https://pretalx.com/devconf-cz-2024/talk/H9QFLM/",children:"Rapid Prototyping"})," with Open Source AI Models"]}),"\n",(0,a.jsxs)(t.li,{children:["Do you like your ",(0,a.jsx)(t.a,{href:"https://pretalx.com/devconf-cz-2024/talk/ECU7QS/",children:"changelogs"}),"?"]}),"\n",(0,a.jsxs)(t.li,{children:["OpenScanHub - ",(0,a.jsx)(t.a,{href:"https://pretalx.com/devconf-cz-2024/talk/7C38GJ/",children:"Static Analysis"})," of a Linux Distribution"]}),"\n",(0,a.jsxs)(t.li,{children:["Creating a ",(0,a.jsx)(t.a,{href:"https://pretalx.com/devconf-cz-2024/talk/RXKMKA/",children:"Language Server"})," for RPM Spec Files"]}),"\n",(0,a.jsxs)(t.li,{children:["Containers and Kubernetes Made Easy: A 15-minute dive into ",(0,a.jsx)(t.a,{href:"https://pretalx.com/devconf-cz-2024/talk/HKWP7V/",children:"Podman Desktop"})]}),"\n",(0,a.jsx)(t.li,{children:(0,a.jsx)(t.a,{href:"https://pretalx.com/devconf-cz-2024/talk/8PARM8/",children:"\u201cLeadership: Where people skills meet programmers\u201d"})}),"\n"]}),"\n",(0,a.jsx)(t.h2,{id:"wrap-up",children:"Wrap up"}),"\n",(0,a.jsxs)(t.p,{children:["I have to admit that these 3 days have been pretty exhaustive, including\ninformation overload ","\ud83d\ude04"," but at the same time it was really nice to meet\nwith the colleagues and at least some of our users who are not based in Brno."]}),"\n",(0,a.jsxs)(t.section,{"data-footnotes":!0,className:"footnotes",children:[(0,a.jsx)(t.h2,{className:"sr-only",id:"footnote-label",children:"Footnotes"}),"\n",(0,a.jsxs)(t.ol,{children:["\n",(0,a.jsxs)(t.li,{id:"user-content-fn-1-ba1b92",children:["\n",(0,a.jsxs)(t.p,{children:["except for the Fedora's downstream ;) ",(0,a.jsx)(t.a,{href:"#user-content-fnref-1-ba1b92","data-footnote-backref":"","aria-label":"Back to reference 1",className:"data-footnote-backref",children:"\u21a9"})]}),"\n"]}),"\n"]}),"\n"]})]})}function d(e={}){const{wrapper:t}={...(0,s.a)(),...e.components};return t?(0,a.jsx)(t,{...e,children:(0,a.jsx)(c,{...e})}):c(e)}},11151:(e,t,n)=>{n.d(t,{Z:()=>r,a:()=>i});var a=n(67294);const s={},o=a.createContext(s);function i(e){const t=a.useContext(o);return a.useMemo((function(){return"function"==typeof e?e(t):{...t,...e}}),[t,e])}function r(e){let t;return t=e.disableParentContext?"function"==typeof e.components?e.components(s):e.components||s:i(e.components),a.createElement(o.Provider,{value:t},e.children)}}}]); \ No newline at end of file diff --git a/assets/js/2d2e3e59.8602c0c2.js b/assets/js/2d2e3e59.8602c0c2.js deleted file mode 100644 index 190d244..0000000 --- a/assets/js/2d2e3e59.8602c0c2.js +++ /dev/null @@ -1 +0,0 @@ -"use strict";(self.webpackChunkfi=self.webpackChunkfi||[]).push([[6689],{55268:(e,t,n)=>{n.r(t),n.d(t,{assets:()=>h,contentTitle:()=>i,default:()=>d,frontMatter:()=>o,metadata:()=>r,toc:()=>l});var a=n(85893),s=n(11151);const o={title:"DevConf.cz 2024",description:"Sharing my experience on DevConf.cz 2024.\n",date:new 
Date("2024-06-19T00:00:00.000Z"),authors:[{key:"mf",title:"a.k.a. exhausted DevConf attendee"}],tags:["\ud83c\udfed","red-hat","fedora","devconf","conferences"]},i=void 0,r={permalink:"/blog/2024/06/19/devconf-2024",editUrl:"https://github.com/mfocko/blog/tree/main/blog/2024-06-19-devconf-2024.md",source:"@site/blog/2024-06-19-devconf-2024.md",title:"DevConf.cz 2024",description:"Sharing my experience on DevConf.cz 2024.\n",date:"2024-06-19T00:00:00.000Z",formattedDate:"June 19, 2024",tags:[{label:"\ud83c\udfed",permalink:"/blog/tags/\ud83c\udfed"},{label:"red-hat",permalink:"/blog/tags/red-hat"},{label:"fedora",permalink:"/blog/tags/fedora"},{label:"devconf",permalink:"/blog/tags/devconf"},{label:"conferences",permalink:"/blog/tags/conferences"}],readingTime:5.355,hasTruncateMarker:!0,authors:[{name:"Matej Focko",email:"me+blog@mfocko.xyz",title:"a.k.a. exhausted DevConf attendee",url:"https://gitlab.com/mfocko",imageURL:"https://github.com/mfocko.png",key:"mf"}],frontMatter:{title:"DevConf.cz 2024",description:"Sharing my experience on DevConf.cz 2024.\n",date:"2024-06-19T00:00:00.000Z",authors:[{key:"mf",title:"a.k.a. exhausted DevConf attendee"}],tags:["\ud83c\udfed","red-hat","fedora","devconf","conferences"]},unlisted:!1,nextItem:{title:"LTS distributions",permalink:"/blog/2024/02/07/lts-distros"}},h={authorsImageUrls:[void 0]},l=[{value:"Day 1",id:"day-1",level:2},{value:"Day 2",id:"day-2",level:2},{value:"Day 3",id:"day-3",level:2},{value:"Picks from the Packit Team",id:"picks-from-the-packit-team",level:2},{value:"Wrap up",id:"wrap-up",level:2}];function c(e){const t={a:"a",blockquote:"blockquote",em:"em",h2:"h2",img:"img",li:"li",ol:"ol",p:"p",section:"section",strong:"strong",sup:"sup",ul:"ul",...(0,s.a)(),...e.components};return(0,a.jsxs)(a.Fragment,{children:[(0,a.jsx)(t.p,{children:"I'd like to share my experience and views on some of the talks that I've\nattended on the DevConf.cz 2024."}),"\n",(0,a.jsx)(t.h2,{id:"day-1",children:"Day 1"}),"\n",(0,a.jsx)(t.p,{children:"Let's start with the first day which was Thursday this year as opposed to the\nprevious years when the conference started on Friday and finished on Sunday."}),"\n",(0,a.jsxs)(t.p,{children:["Let's start with the ",(0,a.jsx)(t.em,{children:(0,a.jsx)(t.a,{href:"https://pretalx.com/devconf-cz-2024/talk/AD3HWR/",children:"keynote"})}),". The keynote wasn't very intersting, at some of\nthe slides actually felt like advertisement for other talks on the topic of the\nAI\u2026"]}),"\n",(0,a.jsxs)(t.p,{children:["Next talk about ",(0,a.jsx)(t.em,{children:(0,a.jsx)(t.a,{href:"https://pretalx.com/devconf-cz-2024/talk/3UKGLB/",children:"event-driven Ansible"})})," was way more interesting. It allows you\nto run Ansible playbooks after provisioning hosts, or on certain events, such as\ndiscovered vulnerabilities. On one hand it feels like a very nice thing, but on\nthe other one I can't help but to think how you need to write the playbooks, so\nthat they are generic enough. One more example that's been given comes from the\npossibility to react to tickets, e.g., outages and this feels like something\nthat could be abused to cause DoS."]}),"\n",(0,a.jsxs)(t.p,{children:["Afterwards we've seen two lightning talks, one about\n",(0,a.jsx)(t.em,{children:(0,a.jsx)(t.a,{href:"https://pretalx.com/devconf-cz-2024/talk/KSDRWL/",children:"choosing the right OpenShift size"})})," which was a pretty quick, but detailly\nlisted all of the possible ways you can deploy OpenShift. 
This lightning talk\nwas followed by the first AI (lightning) talk I've attended about\n",(0,a.jsx)(t.em,{children:(0,a.jsx)(t.a,{href:"https://pretalx.com/devconf-cz-2024/talk/H9QFLM/",children:"rapid prototyping"})})," of the open-source AI models."]}),"\n",(0,a.jsxs)(t.p,{children:["As someone who's involved in the automation of the RPM packaging and testing, of\ncourse, we had to attend ",(0,a.jsx)(t.em,{children:(0,a.jsx)(t.a,{href:"https://pretalx.com/devconf-cz-2024/talk/NNKT3F/",children:"Learning from Nix"})}),". Nix has a very intriguing\nconcept which is pretty powerful, but painful at the same time. This can be\nsummed up pretty nicely by ",(0,a.jsx)(t.a,{href:"https://twitch.tv/tsoding",children:"Tsoding"})," who got asked about some tips & tricks for\nsomeone who wants to try out NixOS:"]}),"\n",(0,a.jsxs)(t.blockquote,{children:["\n",(0,a.jsx)(t.p,{children:"Just don't."}),"\n"]}),"\n",(0,a.jsx)(t.p,{children:"And now we're moving into a section where everything revolves about the Packit\nTeam :)"}),"\n",(0,a.jsxs)(t.p,{children:["First talk about ",(0,a.jsx)(t.em,{children:(0,a.jsx)(t.a,{href:"https://pretalx.com/devconf-cz-2024/talk/ECU7QS/",children:"changelogs"})})," was an interactive session that was (probably)\nmeant to share different approaches we take to handle this rather convoluted\ntopic that involves changelogs on both upstream and also on downstream with no\nrules",(0,a.jsx)(t.sup,{children:(0,a.jsx)(t.a,{href:"#user-content-fn-1-ba1b92",id:"user-content-fnref-1-ba1b92","data-footnote-ref":!0,"aria-describedby":"footnote-label",children:"1"})}),"."]}),"\n",(0,a.jsx)(t.p,{children:(0,a.jsx)(t.img,{src:"https://i.imgur.com/YHstMAt.jpg",alt:"changelogs"})}),"\n",(0,a.jsxs)(t.p,{children:["Next one was about ",(0,a.jsx)(t.em,{children:(0,a.jsx)(t.a,{href:"https://pretalx.com/devconf-cz-2024/talk/7C38GJ/",children:"static analysis"})})," done by ",(0,a.jsx)(t.a,{href:"https://openscanhub.dev/",children:"OpenScanHub"}),". I like the idea of\nrunning the static analysis that can uncover nasty bugs (as it has been even\nshowed in the talk) at the same time as they are introduced. I gotta admit that\nafter seeing the UI of the ",(0,a.jsx)(t.a,{href:"https://openscanhub.fedoraproject.org/",children:"deployed OpenScanHub"})," on the Fedora Infra, I couldn't\nhelp but to think about the ",(0,a.jsx)(t.a,{href:"https://x.com/usgraphics",children:"United States Graphics Company"})," ","\ud83d\ude04"," The UI is\nto the point, no fancy annoying shit, you get what you need, it's hard to get\nlost. ",(0,a.jsx)(t.strong,{children:"Just simplicity."})," Best kind of UI/UX in my opinion."]}),"\n",(0,a.jsxs)(t.p,{children:["After the OpenScanHub talk we're getting to talks that were taken in a totally\ndifferent direction from the usual talks you're used to ","\ud83d\ude09"," First one was\ngiven title of ",(0,a.jsx)(t.em,{children:(0,a.jsx)(t.a,{href:"https://pretalx.com/devconf-cz-2024/talk/X8SYDG/",children:"\u201cIndiana Jones and obsoleted projects\u201d"})})," by ",(0,a.jsx)(t.a,{href:"https://rodina-sucha.cz/@mirek",children:"Mirek"}),". He talked\nabout projects that got obsoleted, but started with projects that had no\nrelation to IT field at all. 
I'd mark this talk as a ",(0,a.jsx)(t.em,{children:"stand up"})," without any\nhesitation."]}),"\n",(0,a.jsxs)(t.p,{children:["And finally we will wrap up the first day with the talk where speakers spoke the\nleast\u2026 ",(0,a.jsx)(t.em,{children:(0,a.jsx)(t.a,{href:"https://pretalx.com/devconf-cz-2024/talk/BDMWF3/",children:"\u201cLet the users speak!\u201d"})})," that involved users of both Packit and\nTesting Farm who spoke about their use case and benefits they gained from using\nboth services in a symbiosis."]}),"\n",(0,a.jsx)(t.h2,{id:"day-2",children:"Day 2"}),"\n",(0,a.jsxs)(t.p,{children:["On the second day I've attended less talks to not burn myself out :) I've\nstarted with an AI-related talk with title ",(0,a.jsx)(t.em,{children:(0,a.jsx)(t.a,{href:"https://pretalx.com/devconf-cz-2024/talk/QSF9QQ/",children:"\u201cAI: Open source will save us!\u201d"})}),",\neven though this talk has been improvised, as the speakers from the schedule\ncouldn't have attended, it provided a nice overview what ",(0,a.jsx)(t.a,{href:"https://github.com/instructlab/instructlab",children:"InstructLab"})," can do\nand how can you \u201cfeed\u201d the relevant info into the language models by yourself."]}),"\n",(0,a.jsxs)(t.p,{children:["After that I attended a ",(0,a.jsx)(t.em,{children:"\u201ccoffee enthusiasts Meetup\u201d"})," which was very nice and,\nof course, an organized chaos ","\ud83d\ude09"]}),"\n",(0,a.jsxs)(t.p,{children:["Before attending the social event I wrapped up the second day with a lightning\ntalk about ",(0,a.jsx)(t.em,{children:(0,a.jsx)(t.a,{href:"https://pretalx.com/devconf-cz-2024/talk/SXWE7K/",children:"recent updates in Toolbx"})}),". I've used both ",(0,a.jsx)(t.a,{href:"https://containertoolbx.org/",children:"toolbx"})," and\n",(0,a.jsx)(t.a,{href:"https://distrobox.it/",children:"distrobox"}),", so it's nice to see the improvements in progress and also that both\nprojects are well and lively."]}),"\n",(0,a.jsx)(t.h2,{id:"day-3",children:"Day 3"}),"\n",(0,a.jsxs)(t.p,{children:["On the third day I've attended only in the afternoon. \u201cStarted\u201d my day with\na discussion ",(0,a.jsx)(t.em,{children:(0,a.jsx)(t.a,{href:"https://pretalx.com/devconf-cz-2024/talk/8PARM8/",children:"\u201cLeadership: Where people skills meet programmers\u201d"})})," which was\nvery nice for gaining an insight into how developer, manager and QE lead roles\noverlap."]}),"\n",(0,a.jsxs)(t.p,{children:["That talk has been followed up by a talk about ",(0,a.jsx)(t.a,{href:"https://pretalx.com/devconf-cz-2024/talk/8T88MT/",children:"role rotation"})," in our Packit\nTeam. I would say it is a nice \u201cupgrade\u201d to the agile process which allows you\nto not create a single point of failure in the mundane and repetitive processes\nwithin your team."]}),"\n",(0,a.jsxs)(t.p,{children:["And this day has been finished off with a talk about ",(0,a.jsx)(t.a,{href:"https://pretalx.com/devconf-cz-2024/talk/WVNJZS/",children:"shifting left"})," in Podman.\nIt's nice to see how other teams utilize our Packit Service and also the\nservices we rely on, such as ",(0,a.jsx)(t.a,{href:"https://copr.fedorainfracloud.org/",children:"Copr"})," or ",(0,a.jsx)(t.a,{href:"https://docs.testing-farm.io/Testing%20Farm/0.1/index.html",children:"Testing Farm"}),". 
With the help of Cockpit\ntests they can catch breaking changes early on, or even bugs that have been\nintroduced and break usage of the dependent projects."]}),"\n",(0,a.jsx)(t.p,{children:(0,a.jsx)(t.img,{src:"https://i.imgur.com/bp6FxT9.jpg",alt:"shifting left"})}),"\n",(0,a.jsx)(t.h2,{id:"picks-from-the-packit-team",children:"Picks from the Packit Team"}),"\n",(0,a.jsx)(t.p,{children:"On the Tuesday, during our Packit stand up, I have managed to abuse my\nKanban Lead role to collect some of the talks that each of us would recommend:"}),"\n",(0,a.jsxs)(t.ul,{children:["\n",(0,a.jsxs)(t.li,{children:[(0,a.jsx)(t.a,{href:"https://pretalx.com/devconf-cz-2024/talk/H9QFLM/",children:"Rapid Prototyping"})," with Open Source AI Models"]}),"\n",(0,a.jsxs)(t.li,{children:["Do you like your ",(0,a.jsx)(t.a,{href:"https://pretalx.com/devconf-cz-2024/talk/ECU7QS/",children:"changelogs"}),"?"]}),"\n",(0,a.jsxs)(t.li,{children:["OpenScanHub - ",(0,a.jsx)(t.a,{href:"https://pretalx.com/devconf-cz-2024/talk/7C38GJ/",children:"Static Analysis"})," of a Linux Distribution"]}),"\n",(0,a.jsxs)(t.li,{children:["Creating a ",(0,a.jsx)(t.a,{href:"https://pretalx.com/devconf-cz-2024/talk/RXKMKA/",children:"Language Server"})," for RPM Spec Files"]}),"\n",(0,a.jsxs)(t.li,{children:["Containers and Kubernetes Made Easy: A 15-minute dive into ",(0,a.jsx)(t.a,{href:"https://pretalx.com/devconf-cz-2024/talk/HKWP7V/",children:"Podman Desktop"})]}),"\n",(0,a.jsx)(t.li,{children:(0,a.jsx)(t.a,{href:"https://pretalx.com/devconf-cz-2024/talk/8PARM8/",children:"\u201cLeadership: Where people skills meet programmers\u201d"})}),"\n"]}),"\n",(0,a.jsx)(t.h2,{id:"wrap-up",children:"Wrap up"}),"\n",(0,a.jsxs)(t.p,{children:["I have to admit that these 3 days have been pretty exhaustive, including\ninformation overload ","\ud83d\ude04"," but at the same time it was really nice to meet\nwith the colleagues and at least some of our users who are not based in Brno."]}),"\n",(0,a.jsxs)(t.section,{"data-footnotes":!0,className:"footnotes",children:[(0,a.jsx)(t.h2,{className:"sr-only",id:"footnote-label",children:"Footnotes"}),"\n",(0,a.jsxs)(t.ol,{children:["\n",(0,a.jsxs)(t.li,{id:"user-content-fn-1-ba1b92",children:["\n",(0,a.jsxs)(t.p,{children:["except for the Fedora's downstream ;) ",(0,a.jsx)(t.a,{href:"#user-content-fnref-1-ba1b92","data-footnote-backref":"","aria-label":"Back to reference 1",className:"data-footnote-backref",children:"\u21a9"})]}),"\n"]}),"\n"]}),"\n"]})]})}function d(e={}){const{wrapper:t}={...(0,s.a)(),...e.components};return t?(0,a.jsx)(t,{...e,children:(0,a.jsx)(c,{...e})}):c(e)}},11151:(e,t,n)=>{n.d(t,{Z:()=>r,a:()=>i});var a=n(67294);const s={},o=a.createContext(s);function i(e){const t=a.useContext(o);return a.useMemo((function(){return"function"==typeof e?e(t):{...t,...e}}),[t,e])}function r(e){let t;return t=e.disableParentContext?"function"==typeof e.components?e.components(s):e.components||s:i(e.components),a.createElement(o.Provider,{value:t},e.children)}}}]); \ No newline at end of file diff --git a/assets/js/3716fece.604cd631.js b/assets/js/3716fece.604cd631.js new file mode 100644 index 0000000..fc2020f --- /dev/null +++ b/assets/js/3716fece.604cd631.js @@ -0,0 +1 @@ +"use strict";(self.webpackChunkfi=self.webpackChunkfi||[]).push([[1511],{76225:(e,t,n)=>{n.r(t),n.d(t,{assets:()=>i,contentTitle:()=>c,default:()=>f,frontMatter:()=>r,metadata:()=>s,toc:()=>l});var o=n(85893),a=n(11151);const r={title:"DevConf.cz 2024",description:"Sharing my experience on DevConf.cz 2024.\n",date:new 
Date("2024-06-19T00:00:00.000Z"),authors:[{key:"mf",title:"a.k.a. exhausted DevConf attendee"}],tags:["\ud83c\udfed","red-hat","fedora","devconf","conferences"]},c=void 0,s={permalink:"/blog/2024/06/19/devconf-2024",editUrl:"https://github.com/mfocko/blog/tree/main/blog/2024-06-19-devconf-2024.md",source:"@site/blog/2024-06-19-devconf-2024.md",title:"DevConf.cz 2024",description:"Sharing my experience on DevConf.cz 2024.\n",date:"2024-06-19T00:00:00.000Z",formattedDate:"June 19, 2024",tags:[{label:"\ud83c\udfed",permalink:"/blog/tags/\ud83c\udfed"},{label:"red-hat",permalink:"/blog/tags/red-hat"},{label:"fedora",permalink:"/blog/tags/fedora"},{label:"devconf",permalink:"/blog/tags/devconf"},{label:"conferences",permalink:"/blog/tags/conferences"}],readingTime:5.36,hasTruncateMarker:!0,authors:[{name:"Matej Focko",email:"me+blog@mfocko.xyz",title:"a.k.a. exhausted DevConf attendee",url:"https://gitlab.com/mfocko",imageURL:"https://github.com/mfocko.png",key:"mf"}],frontMatter:{title:"DevConf.cz 2024",description:"Sharing my experience on DevConf.cz 2024.\n",date:"2024-06-19T00:00:00.000Z",authors:[{key:"mf",title:"a.k.a. exhausted DevConf attendee"}],tags:["\ud83c\udfed","red-hat","fedora","devconf","conferences"]},unlisted:!1,nextItem:{title:"LTS distributions",permalink:"/blog/2024/02/07/lts-distros"}},i={authorsImageUrls:[void 0]},l=[];function d(e){const t={p:"p",...(0,a.a)(),...e.components};return(0,o.jsx)(t.p,{children:"I'd like to share my experience and views on some of the talks that I've\nattended on the DevConf.cz 2024."})}function f(e={}){const{wrapper:t}={...(0,a.a)(),...e.components};return t?(0,o.jsx)(t,{...e,children:(0,o.jsx)(d,{...e})}):d(e)}},11151:(e,t,n)=>{n.d(t,{Z:()=>s,a:()=>c});var o=n(67294);const a={},r=o.createContext(a);function c(e){const t=o.useContext(r);return o.useMemo((function(){return"function"==typeof e?e(t):{...t,...e}}),[t,e])}function s(e){let t;return t=e.disableParentContext?"function"==typeof e.components?e.components(a):e.components||a:c(e.components),o.createElement(r.Provider,{value:t},e.children)}}}]); \ No newline at end of file diff --git a/assets/js/3716fece.62d9d597.js b/assets/js/3716fece.62d9d597.js deleted file mode 100644 index e044bea..0000000 --- a/assets/js/3716fece.62d9d597.js +++ /dev/null @@ -1 +0,0 @@ -"use strict";(self.webpackChunkfi=self.webpackChunkfi||[]).push([[1511],{76225:(e,t,n)=>{n.r(t),n.d(t,{assets:()=>i,contentTitle:()=>c,default:()=>f,frontMatter:()=>r,metadata:()=>s,toc:()=>l});var o=n(85893),a=n(11151);const r={title:"DevConf.cz 2024",description:"Sharing my experience on DevConf.cz 2024.\n",date:new Date("2024-06-19T00:00:00.000Z"),authors:[{key:"mf",title:"a.k.a. exhausted DevConf attendee"}],tags:["\ud83c\udfed","red-hat","fedora","devconf","conferences"]},c=void 0,s={permalink:"/blog/2024/06/19/devconf-2024",editUrl:"https://github.com/mfocko/blog/tree/main/blog/2024-06-19-devconf-2024.md",source:"@site/blog/2024-06-19-devconf-2024.md",title:"DevConf.cz 2024",description:"Sharing my experience on DevConf.cz 2024.\n",date:"2024-06-19T00:00:00.000Z",formattedDate:"June 19, 2024",tags:[{label:"\ud83c\udfed",permalink:"/blog/tags/\ud83c\udfed"},{label:"red-hat",permalink:"/blog/tags/red-hat"},{label:"fedora",permalink:"/blog/tags/fedora"},{label:"devconf",permalink:"/blog/tags/devconf"},{label:"conferences",permalink:"/blog/tags/conferences"}],readingTime:5.355,hasTruncateMarker:!0,authors:[{name:"Matej Focko",email:"me+blog@mfocko.xyz",title:"a.k.a. 
exhausted DevConf attendee",url:"https://gitlab.com/mfocko",imageURL:"https://github.com/mfocko.png",key:"mf"}],frontMatter:{title:"DevConf.cz 2024",description:"Sharing my experience on DevConf.cz 2024.\n",date:"2024-06-19T00:00:00.000Z",authors:[{key:"mf",title:"a.k.a. exhausted DevConf attendee"}],tags:["\ud83c\udfed","red-hat","fedora","devconf","conferences"]},unlisted:!1,nextItem:{title:"LTS distributions",permalink:"/blog/2024/02/07/lts-distros"}},i={authorsImageUrls:[void 0]},l=[];function d(e){const t={p:"p",...(0,a.a)(),...e.components};return(0,o.jsx)(t.p,{children:"I'd like to share my experience and views on some of the talks that I've\nattended on the DevConf.cz 2024."})}function f(e={}){const{wrapper:t}={...(0,a.a)(),...e.components};return t?(0,o.jsx)(t,{...e,children:(0,o.jsx)(d,{...e})}):d(e)}},11151:(e,t,n)=>{n.d(t,{Z:()=>s,a:()=>c});var o=n(67294);const a={},r=o.createContext(a);function c(e){const t=o.useContext(r);return o.useMemo((function(){return"function"==typeof e?e(t):{...t,...e}}),[t,e])}function s(e){let t;return t=e.disableParentContext?"function"==typeof e.components?e.components(a):e.components||a:c(e.components),o.createElement(r.Provider,{value:t},e.children)}}}]); \ No newline at end of file diff --git a/assets/js/4200b1a9.1ca93be5.js b/assets/js/4200b1a9.1ca93be5.js deleted file mode 100644 index bb7e7ae..0000000 --- a/assets/js/4200b1a9.1ca93be5.js +++ /dev/null @@ -1 +0,0 @@ -"use strict";(self.webpackChunkfi=self.webpackChunkfi||[]).push([[866],{24612:e=>{e.exports=JSON.parse('{"blogPosts":[{"id":"/2024/06/19/devconf-2024","metadata":{"permalink":"/blog/2024/06/19/devconf-2024","editUrl":"https://github.com/mfocko/blog/tree/main/blog/2024-06-19-devconf-2024.md","source":"@site/blog/2024-06-19-devconf-2024.md","title":"DevConf.cz 2024","description":"Sharing my experience on DevConf.cz 2024.\\n","date":"2024-06-19T00:00:00.000Z","formattedDate":"June 19, 2024","tags":[{"label":"\ud83c\udfed","permalink":"/blog/tags/\ud83c\udfed"},{"label":"red-hat","permalink":"/blog/tags/red-hat"},{"label":"fedora","permalink":"/blog/tags/fedora"},{"label":"devconf","permalink":"/blog/tags/devconf"},{"label":"conferences","permalink":"/blog/tags/conferences"}],"readingTime":5.355,"hasTruncateMarker":true,"authors":[{"name":"Matej Focko","email":"me+blog@mfocko.xyz","title":"a.k.a. exhausted DevConf attendee","url":"https://gitlab.com/mfocko","imageURL":"https://github.com/mfocko.png","key":"mf"}],"frontMatter":{"title":"DevConf.cz 2024","description":"Sharing my experience on DevConf.cz 2024.\\n","date":"2024-06-19T00:00:00.000Z","authors":[{"key":"mf","title":"a.k.a. exhausted DevConf attendee"}],"tags":["\ud83c\udfed","red-hat","fedora","devconf","conferences"]},"unlisted":false,"nextItem":{"title":"LTS distributions","permalink":"/blog/2024/02/07/lts-distros"}},"content":"I\'d like to share my experience and views on some of the talks that I\'ve\\nattended on the DevConf.cz 2024.\\n\\n\x3c!--truncate--\x3e\\n\\n## Day 1\\n\\nLet\'s start with the first day which was Thursday this year as opposed to the\\nprevious years when the conference started on Friday and finished on Sunday.\\n\\nLet\'s start with the _[keynote]_. The keynote wasn\'t very intersting, at some of\\nthe slides actually felt like advertisement for other talks on the topic of the\\nAI\u2026\\n\\nNext talk about _[event-driven Ansible]_ was way more interesting. It allows you\\nto run Ansible playbooks after provisioning hosts, or on certain events, such as\\ndiscovered vulnerabilities. 
On one hand it feels like a very nice thing, but on\\nthe other one I can\'t help but to think how you need to write the playbooks, so\\nthat they are generic enough. One more example that\'s been given comes from the\\npossibility to react to tickets, e.g., outages and this feels like something\\nthat could be abused to cause DoS.\\n\\nAfterwards we\'ve seen two lightning talks, one about\\n_[choosing the right OpenShift size]_ which was a pretty quick, but detailly\\nlisted all of the possible ways you can deploy OpenShift. This lightning talk\\nwas followed by the first AI (lightning) talk I\'ve attended about\\n_[rapid prototyping]_ of the open-source AI models.\\n\\nAs someone who\'s involved in the automation of the RPM packaging and testing, of\\ncourse, we had to attend _[Learning from Nix]_. Nix has a very intriguing\\nconcept which is pretty powerful, but painful at the same time. This can be\\nsummed up pretty nicely by [Tsoding] who got asked about some tips & tricks for\\nsomeone who wants to try out NixOS:\\n\\n> Just don\'t.\\n\\nAnd now we\'re moving into a section where everything revolves about the Packit\\nTeam :)\\n\\nFirst talk about _[changelogs]_ was an interactive session that was (probably)\\nmeant to share different approaches we take to handle this rather convoluted\\ntopic that involves changelogs on both upstream and also on downstream with no\\nrules[^1].\\n\\n![changelogs](https://i.imgur.com/YHstMAt.jpg)\\n\\nNext one was about _[static analysis]_ done by [OpenScanHub]. I like the idea of\\nrunning the static analysis that can uncover nasty bugs (as it has been even\\nshowed in the talk) at the same time as they are introduced. I gotta admit that\\nafter seeing the UI of the [deployed OpenScanHub] on the Fedora Infra, I couldn\'t\\nhelp but to think about the [United States Graphics Company] :smile: The UI is\\nto the point, no fancy annoying shit, you get what you need, it\'s hard to get\\nlost. **Just simplicity.** Best kind of UI/UX in my opinion.\\n\\nAfter the OpenScanHub talk we\'re getting to talks that were taken in a totally\\ndifferent direction from the usual talks you\'re used to :wink: First one was\\ngiven title of _[\u201cIndiana Jones and obsoleted projects\u201d]_ by [Mirek]. He talked\\nabout projects that got obsoleted, but started with projects that had no\\nrelation to IT field at all. I\'d mark this talk as a _stand up_ without any\\nhesitation.\\n\\nAnd finally we will wrap up the first day with the talk where speakers spoke the\\nleast\u2026 _[\u201cLet the users speak!\u201d]_ that involved users of both Packit and\\nTesting Farm who spoke about their use case and benefits they gained from using\\nboth services in a symbiosis.\\n\\n## Day 2\\n\\nOn the second day I\'ve attended less talks to not burn myself out :) I\'ve\\nstarted with an AI-related talk with title _[\u201cAI: Open source will save us!\u201d]_,\\neven though this talk has been improvised, as the speakers from the schedule\\ncouldn\'t have attended, it provided a nice overview what [InstructLab] can do\\nand how can you \u201cfeed\u201d the relevant info into the language models by yourself.\\n\\nAfter that I attended a _\u201ccoffee enthusiasts Meetup\u201d_ which was very nice and,\\nof course, an organized chaos :wink:\\n\\nBefore attending the social event I wrapped up the second day with a lightning\\ntalk about _[recent updates in Toolbx]_. 
I\'ve used both [toolbx] and\\n[distrobox], so it\'s nice to see the improvements in progress and also that both\\nprojects are well and lively.\\n\\n## Day 3\\n\\nOn the third day I\'ve attended only in the afternoon. \u201cStarted\u201d my day with\\na discussion _[\u201cLeadership: Where people skills meet programmers\u201d]_ which was\\nvery nice for gaining an insight into how developer, manager and QE lead roles\\noverlap.\\n\\nThat talk has been followed up by a talk about [role rotation] in our Packit\\nTeam. I would say it is a nice \u201cupgrade\u201d to the agile process which allows you\\nto not create a single point of failure in the mundane and repetitive processes\\nwithin your team.\\n\\nAnd this day has been finished off with a talk about [shifting left] in Podman.\\nIt\'s nice to see how other teams utilize our Packit Service and also the\\nservices we rely on, such as [Copr] or [Testing Farm]. With the help of Cockpit\\ntests they can catch breaking changes early on, or even bugs that have been\\nintroduced and break usage of the dependent projects.\\n\\n![shifting left](https://i.imgur.com/bp6FxT9.jpg)\\n\\n## Picks from the Packit Team\\n\\nOn the Tuesday, during our Packit stand up, I have managed to abuse my\\nKanban Lead role to collect some of the talks that each of us would recommend:\\n\\n- [Rapid Prototyping] with Open Source AI Models\\n- Do you like your [changelogs]?\\n- OpenScanHub - [Static Analysis] of a Linux Distribution\\n- Creating a [Language Server] for RPM Spec Files\\n- Containers and Kubernetes Made Easy: A 15-minute dive into [Podman Desktop]\\n- [\u201cLeadership: Where people skills meet programmers\u201d]\\n\\n## Wrap up\\n\\nI have to admit that these 3 days have been pretty exhaustive, including\\ninformation overload :smile: but at the same time it was really nice to meet\\nwith the colleagues and at least some of our users who are not based in Brno.\\n\\n[^1]: except for the Fedora\'s downstream ;)\\n\\n[keynote]: https://pretalx.com/devconf-cz-2024/talk/AD3HWR/\\n[event-driven ansible]: https://pretalx.com/devconf-cz-2024/talk/3UKGLB/\\n[choosing the right openshift size]: https://pretalx.com/devconf-cz-2024/talk/KSDRWL/\\n[rapid prototyping]: https://pretalx.com/devconf-cz-2024/talk/H9QFLM/\\n[learning from nix]: https://pretalx.com/devconf-cz-2024/talk/NNKT3F/\\n[tsoding]: https://twitch.tv/tsoding\\n[changelogs]: https://pretalx.com/devconf-cz-2024/talk/ECU7QS/\\n[static analysis]: https://pretalx.com/devconf-cz-2024/talk/7C38GJ/\\n[openscanhub]: https://openscanhub.dev/\\n[deployed openscanhub]: https://openscanhub.fedoraproject.org/\\n[united states graphics company]: https://x.com/usgraphics\\n[\u201cindiana jones and obsoleted projects\u201d]: https://pretalx.com/devconf-cz-2024/talk/X8SYDG/\\n[mirek]: https://rodina-sucha.cz/@mirek\\n[\u201clet the users speak!\u201d]: https://pretalx.com/devconf-cz-2024/talk/BDMWF3/\\n[\u201cai: open source will save us!\u201d]: https://pretalx.com/devconf-cz-2024/talk/QSF9QQ/\\n[instructlab]: https://github.com/instructlab/instructlab\\n[recent updates in toolbx]: https://pretalx.com/devconf-cz-2024/talk/SXWE7K/\\n[toolbx]: https://containertoolbx.org/\\n[distrobox]: https://distrobox.it/\\n[\u201cleadership: where people skills meet programmers\u201d]: https://pretalx.com/devconf-cz-2024/talk/8PARM8/\\n[role rotation]: https://pretalx.com/devconf-cz-2024/talk/8T88MT/\\n[shifting left]: https://pretalx.com/devconf-cz-2024/talk/WVNJZS/\\n[copr]: https://copr.fedorainfracloud.org/\\n[testing 
farm]: https://docs.testing-farm.io/Testing%20Farm/0.1/index.html\\n[language server]: https://pretalx.com/devconf-cz-2024/talk/RXKMKA/\\n[podman desktop]: https://pretalx.com/devconf-cz-2024/talk/HKWP7V/"},{"id":"/2024/02/07/lts-distros","metadata":{"permalink":"/blog/2024/02/07/lts-distros","editUrl":"https://github.com/mfocko/blog/tree/main/blog/2024-02-07-lts-distros.md","source":"@site/blog/2024-02-07-lts-distros.md","title":"LTS distributions","description":"Shower thoughts on the LTS Linux distributions.\\n","date":"2024-02-07T00:00:00.000Z","formattedDate":"February 7, 2024","tags":[{"label":"lts","permalink":"/blog/tags/lts"},{"label":"linux distributions","permalink":"/blog/tags/linux-distributions"},{"label":"support","permalink":"/blog/tags/support"},{"label":"paywall","permalink":"/blog/tags/paywall"}],"readingTime":14.515,"hasTruncateMarker":true,"authors":[{"name":"Matej Focko","email":"me+blog@mfocko.xyz","title":"a.k.a. small Fedora maintainer","url":"https://gitlab.com/mfocko","imageURL":"https://github.com/mfocko.png","key":"mf"}],"frontMatter":{"title":"LTS distributions","description":"Shower thoughts on the LTS Linux distributions.\\n","date":"2024-02-07T00:00:00.000Z","authors":[{"key":"mf","title":"a.k.a. small Fedora maintainer"}],"tags":["lts","linux distributions","support","paywall"],"hide_table_of_contents":false},"unlisted":false,"prevItem":{"title":"DevConf.cz 2024","permalink":"/blog/2024/06/19/devconf-2024"},"nextItem":{"title":"Mixed feelings on Rust","permalink":"/blog/2024/01/28/rust-opinion"}},"content":"Linux distributions are a common choice for running the servers. There\'s a wide\\nvariety of distributions, but on the servers majority is made by only a few.\\n\\nSome corporations also profit from the support of the \u201cbig\u201d distributions. Let\'s\\ndive into the pros, cons and peculiarities of such _business_.\\n\\nThis post is inspired/triggered by the following Mastodon post:\\n[![Mastodon post about Ubuntu Pro](https://i.imgur.com/mh5RAlV.png)](https://hackers.town/@antijingoist/111864760073049505)\\n\\n\x3c!--truncate--\x3e\\n\\n:::caution Disclaimer\\n\\nYou may take my opinion with a grain of salt, since I\'m affiliated with Red Hat,\\nbut at the same time I\'ve also seen the other side of the fence, so I know how\\nit works from the perspective of the provider/maintainer.\\n\\n:::\\n\\n:::tip\\n\\nIf you are not very oriented in the matters of Linux distributions and\\nmaintaining of packages, I suggest looking at the [glossary](#glossary) at the\\nend to have a better grasp of the terms that are used throughout the post.\\n\\n:::\\n\\n## Point of linux distributions\\n\\nFirst thing I\'d like to point out is the point of the Linux distributions. What\\nbenefit do they provide? And why there are so many of them\u2026\\n\\nAs it has been brought up many times by the _rms_[^1], Linux by itself is not\\nenough, it\'s just the kernel that does the underlying work. We need more\\nsoftware to utilize the hardware. That\'s the gap that Linux distributions bridge\\nby providing the Linux and much more other software that we need.\\n\\nEach distribution is unique in its own way. Some prefer different ways of\\nhandling the software (like Gentoo that allows you to compile it yourself) and\\nothers stable releases of software (like Debian).\\n\\nIn the end it mostly boils down to the packaging. I, as a user, want to do\\nsomething like\\n\\n```\\n$ sudo dnf5 install firefox\\n```\\n\\nand not bother about anything else. 
I don\'t want to open browser to look the\\nthing up, download it and then click mindlessly 500\xd7 \u201cNext\u201d. I just want to run\\none command and when the maintainers decide it\'s time to move on, another one to\\nupgrade the software to the newer version.\\n\\nOf course, for some use cases you want to minimize the latter. And even make\\nsure that it\'s safe to do it when you need to. You don\'t want to break your\\nproduction deployment just because someone decided it\'s time to push something\\nout.\\n\\nThat\'s when the _maintainers_ come in. They take upon themselves the\\nresponsibility of maintaining the packages. If you\'ve ever used the Debian, you\\nknow very well how _old_ the software is, but that\'s what you might need for\\nyour servers.\\n\\n## Pain of packaging\\n\\nPackaging software _is not_ cost-free. You may as well have 80 % of packages\\nthat don\'t need much care and it\'s rather easy to push them forward, but those\\nremaining, which are complicated and raise issues regularly, will make it up and\\ntake a lot of time and also pain.\\n\\nLibraries are the most common example that might not need much work to be done.\\nOn the other hand, Linux kernel itself is a rather complicated machinery that\\nis patched **a lot** and its build process is not simple either.\\n\\nEven if you consider just those _easily-maintainble_ packages, the process can\\nbe tedious, boring and overall time consuming.\\n\\n:::tip Shameless RHEL-based ecosystem plug\\n\\n[Packit] can help tremendously with the _easily-maintainable_ packages, since it\\n**can** be automated.\\n\\n:::\\n\\n### Packaging whole ecosystems\\n\\nNow it\'s time to talk about whole ecosystems that have some kind of a packaging\\nby themselves. Yes, I mean Python (with its continuous stream of different\\npackage managers), Rust, Go, etc.\\n\\nWhole point of packaging is to have some form of _gating_. In other words, you\\nwant some kind of _quality control_ when pushing changes into the Linux distros.\\n\\nIf you want to package some tool (or even library) from the aforementioned\\necosystems, you need to package all of the dependencies to make sure something\\ndoesn\'t get updated in the meantime (and also that you can safely reproduce the\\nbuilds, if need be).\\n\\nI\'ve tried to package some utilities for EPEL both in Rust and Go. Dependencies\\nform a DAG[^2] and in case of Rust, it\'s _very_ similar to the way `npm` does\\nits packaging.\\n\\n:::danger Spoiler alert\\n\\nYou get a lot of dependencies. And since it\'s a tree of dependencies, there may\\nbe **a lot** of them.\\n\\n:::\\n\\nI have no clue how do the Rust maintainers operate, but I\'m tipping my fedora in\\ntheir direction, since it must be a _pain in the ass_.\\n\\n## Paid distributions\\n\\nYou can find few Linux distributions that are \u201cpaid\u201d. I\'m very well aware of the\\nfact I\'ve used quotes around the word, cause it\'s not that easy and not even\\nsame for all of the distributions that involve some kind of a payment.\\n\\nOne of the first non-free distributions I\'ve come into contact was _[Zorin OS]_\\nwhich basically tries to be the best _transition_ solution when moving away from\\nthe Windows or macOS. If you have a look at the _perks_ of its _Pro_ version\\nthat\'s paid, you may as well decide they are rather questionable\u2026\\n\\nIt\'s time to move into the _Ubuntu Pro_, _RHEL_ and _SLE_ territory. What\'s the\\npoint of those? 
They definitely offer different kind of, let\'s say,\\n_non-free experience_.\\n\\nWith those you are paying mainly for the support and bug/security patches.\\n\\n:::tip Fun fact\\n\\nThere\'s no mention of any kind of support on the Zorin page\u2026 Apart from the fact\\nthat _you are supporting_ the Zorin development.\\n\\n:::\\n\\n## Repository structure\\n\\nAs I have mentioned above, the three _services_[^3] I mentioned are providing\\nsupport with regards to bugs and security vulnerabilites. Therefore it makes\\nsense to have some kind of a process in place when you\'re pushing changes\\n(either updates, patches or _security_ patches) to the distribution. And yes,\\nthese processes are _in place_.\\n\\nIf you think about the amount of packages that is present in the community\\ndistributions like _archLinux_ (14,830 packages) or _Fedora_ (74,309 packages),\\nit is safe to come to a conclusion that _there\'s no way_ to support all of them.\\n\\n:::tip archLinux\\n\\nIt may seem that archLinux contains rather small set of packages, but one of the\\n_killer features_ of archLinux lies in the AUR (archLinux User Repository) where\\nyou can find additional **93,283** packages.\\n\\n:::\\n\\nThat\'s why the Linux distributions have some structure to their repositories\\nthat contain packages. The way you go around this is rather simple, you choose\\nsome set of _critical_ packages that you guarantee support for (like Linux\\nkernel, openSSL, etc.) and maintain those with all the QA processes in place.\\n\\n:::caution Unpopular opinion\\n\\nThis is also one of the reasons why I\'m quite against packaging anything and\\neverything into the Linux distribution. In my opinion it is impossible to\\n**properly** maintain **huge** set of packages and enforce some kind of\\n**quality control**.\\n\\n:::\\n\\n### Ubuntu\\n\\nUbuntu has pretty granular structure of their repositories, namely:\\n\\n- `main` containing the \u201ccore\u201d of the Ubuntu that is maintained by the Canonical,\\n- `universe` containing literally the \u201cuniverse\u201d, packages that everyone likes,\\n but they\'re not crucial, this repo is maintained mostly by the community,\\n- `multiverse` containing packages with some license or copyright issues, and\\n- `restricted` containing _proprietary_ packages like nvidia drivers and such.\\n\\nBy briefly checking my Ubuntu 23.10 installation, here are stats of packages in\\ntheir respective repositories:\\n\\n- `main` with 6,128 packages,\\n- `universe` with 63,380 packages,\\n- `multiverse` with 997 packages, and finally\\n- `restricted` with 784 packages.\\n\\nAs you can see, if we sum them up, they are relatively similar to the Fedora\\nnumbers.\\n\\n### CentOS\\n\\nCentOS on the other hand has a bit simpler structure with BaseOS for the base\\nand AppStream for additional packages:\\n\\n- `baseos` with 1,058 packages,\\n- `appstream` with 5,646 packages, and\\n- `extras-common` with 42 packages.\\n\\nOverall they make up the similar number as the Ubuntu\'s `main` repository. And\\nyou can also notice that there are no additional repositories.\\n\\n:::tip\\n\\nThere\'s also a CRB (CodeReady Builder) repository with dev packages like headers\\nand such.\\n\\nAnd you can also enable EPEL (Extra Packages for Enterprise Linux) which is\\ncommunity-supported and provides another 19,903 packages.\\n\\n:::\\n\\n## Ubuntu Pro\\n\\nNow it\'s time to get back to the Ubuntu Pro. 
There are multiple points that need\\nto be taken in account to be either positive or negative about it\u2026\\n\\nWe can start with the way Ubuntu is released and maintained. Ubuntu has regular\\n6-month release cycle and biannual LTS release. Releases are normally supported\\nfor 9 months with the exception of the LTS releases being supported for 5 years.\\n\\nIf you check out the _[Ubuntu Pro]_ website, you can find the following\\nstatement:\\n\\n> **Ubuntu Pro**\\n>\\n> The most comprehensive subscription for open-source software security\\n>\\n> 30-day trial for enterprises. Always free for personal use.\\n\\n:::tip Personal use\\n\\nUbuntu Pro for _personal use_ consists of 5 installations and in case of the\\ncommunity _ambassadors_ 50.\\n\\n:::\\n\\nOverall if you try to find what is included in the Ubuntu Pro:\\n\\n- high and critical patches,\\n- 10 years of maintenance, and\\n- (optional) 24/7 enterprise-grade support.\\n\\nIf we get back to the screenshot all the way at the beginning of the post:\\n[![Mastodon post about Ubuntu Pro](https://i.imgur.com/mh5RAlV.png)](https://hackers.town/@antijingoist/111864760073049505)\\n\\nand try to look up to which repository the packages mentioned in the screenshot\\nbelong, we will find out that they belong to `universe` repository which is\\nmaintained by the community. Not to mention nature of the packages: multimedia.\\n\\nYou may think about this as a scam, but considering repository consisting of 70k\\npackages, it is not an easy task to do. And with LTS releases we\'re talking\\nabout 5+ years of support.\\n\\n:::info Fedora\\n\\nTry to compare this state to Fedora. It also has a 6-month release cycle, but\\nthere are no LTS releases and each release is supported only for a year.\\n\\n:::\\n\\nCommon strategy, at this point, is to pull out the _open-source_. Yes, we are\\nstill dealing with the open-source, but keep in mind that you\'re trying to patch\\nsome issue in a version that\'s 5 years old, upstream definitely doesn\'t care\\nanymore[^4], the development didn\'t stop 5 years ago, it\'s going on and fixing\\nthis issue in a release from 5 years is not the same as fixing it in the current\\nrelease. At this point, if you are paying for such support, you are actually\\npaying for someone to do _software archaeology_ which **can be** _non-trivial_\\nto do.\\n\\nIn the case of Ubuntu Pro we\'re talking about community support and best-effort\\nsupport by Canonical for the paying customers. And that makes sense to me,\\nrunning LTS distro for 5+ years on a desktop seems like an odd choice, even\\nwith the help of _[podman]_ and _[distrobox]_ or _[toolbx]_ that allow us to use\\nstable or LTS distro as a base and containerized development environments on top\\nof that.\\n\\n## RHEL ecosystem\\n\\nRHEL ecosystem is much more complicated in this matter. 
However it\'s very\\nsimilar to the way SUSE operates with few exceptions.\\n\\nYou can see a flow diagram here:\\n\\n```mermaid\\nflowchart LR;\\n U[upstream] --\x3e FR[Fedora Rawhide];\\n FR --\x3e F[Fedora release];\\n F --\x3e C[CentOS Stream];\\n C --\x3e R[RHEL];\\n```\\n\\nKey things to take and not to take from the flow diagram:\\n\\n- getting from one upstream to its respective downstream is not as simple as the\\n presence of an arrow and it\'s not the same process for all of them\\n- lengths of the arrows are not proportional, specifically:\\n - Fedora Rawhide is _supposed to_ consume updates as soon as possible,\\n - depending on the decision of the maintainer they can, but _don\'t have to_ be\\n included in the currently supported Fedora releases (you can take [Emacs] as\\n an example of such package), but Rawhide eventually becomes the next Fedora\\n release,\\n - CentOS Stream gets branched off a specific Fedora release, and then\\n - ultimately CentOS Stream becomes the next **minor** release of RHEL.\\n- this diagram is simplified by **a lot**\\n\\n:::tip SUSE flow for comparison\\n\\nI\'ll also include a SUSE flow, so you can compare:\\n\\n```mermaid\\nflowchart LR;\\n U[upstream] --\x3e T[openSUSE Tumbleweed];\\n T --\x3e L[openSUSE Leap];\\n L --\x3e S[SUSE Linux Enterprise];\\n S --\x3e L;\\n```\\n\\nYou can notice, as opposed to the RHEL ecosystem, some changes are being\\nbackported to the openSUSE Leap.\\n\\nHowever this is subject to change as there is a new [ALP] project arising which\\nis, more than likely, going to replace the Leap.\\n\\n:::\\n\\n### Change in the model\\n\\nThe flow I\'ve shown above is in effect since late \u201820 and early \u201821. I hope you\\ncan see that it is quite similar to the way SUSE operates too. Before late \u201820\\nthe flow was following:\\n\\n```mermaid\\nflowchart LR;\\n U[upstream] --\x3e FR[Fedora Rawhide];\\n FR --\x3e F[Fedora release];\\n F --\x3e R[RHEL];\\n R --- C[CentOS];\\n```\\n\\nCentOS was the last distribution in that \u201cchain\u201d. This provides some benefits\\nand some negatives.\\n\\n#### Before the change\\n\\nFrom the point of a developer, unless you have some kind of an early access to\\nRHEL, you don\'t see the changes until they land and are already released. This\\nimpairs your ability to test and verify your software before shipping it to your\\nclients that use RHEL.\\n\\nFrom the point of a user, there is one positive, you basically get \u201cfree RHEL\u201d\\nwithout the support. This also allowed you to report bugs against the RHEL,\\nsince they were 1:1 distros (minus the branding and support). So you\'d\\ntechnically get RHEL free of charge.\\n\\nBenefit of such project, except for the cost, is questionable. The main issue,\\nwhich actually became even more apparent after changing the flow, is someone\\nelse repackaging your own product and selling it again.\\n\\n#### After the change\\n\\nFirst of all, the current flow counters the issue mentioned above. You can test\\nyour projects against the _next minor RHEL release_. CentOS Stream is free, so\\nyou can freely incorporate it into your CI pipelines.\\n\\n:::tip Shameless plug pt. 
2\\n\\nAgain, [Packit] can help you on upstream to verify that you\'re not breaking your\\nRPM builds and on top of that you can also use [Testing Farm] to run tests on a\\nspecific Fedora or CentOS Stream releases.\\n\\n> Green tests may not be green everywhere and catching such issues as soon as\\n> possible costs much less than catching them further down the chain.\\n\\n:::\\n\\nThere are many people thinking that RHEL has become closed-source. It is not.\\nThe development happens _out in the open_, it\'s more open that it was before.\\nHowever with the cost of not getting the exact same thing for free. You can get\\nthe next minor RHEL, not the same that\'s normally paid for. [Packit] is an\\nexample of a service that is deployed on the CentOS 9 Stream and even used to be\\ndeployed on Fedora, but the regular 6-month release cycle caused some minor\\nissues here and there.\\n\\n_Production-ready_ is something that heavily depends on the context\u2026\\n\\n:::tip Free \u201cclones\u201d\\n\\nAfter this change so-called _free \u201cclones\u201d_ emerged. I have to admit that in\\ncase of _[AlmaLinux]_ I can see some benefits e.g., pushing for live images and\\nsupport of various desktop environments, Raspberry Pi support or even WSL images\\nbeing present in the M$ Store and easy to install.\\n\\n:::\\n\\n## Open-source and paid support\\n\\nOverall I don\'t think that paying for the support of 5 years old _non-critical_\\npackages is going against the open-source. It is a non-trivial work that, in\\nmajority of cases, cannot be included in the upstream, therefore the benefit is\\nreapt only by the paying customers. I have to admit that in the case of the\\nUbuntu Pro it may seem a bit weird (hiding patches behind the paywall). However\\nwe\'re still talking about rather big set of packages that will affect a minority\\nof server workloads, if any.\\n\\n## Glossary\\n\\n- _rolling release_ - continuously released without \u201csignificant milestones\u201d\\n\\n :::tip\\n\\n As an example of rolling distribution you can take archLinux, openSUSE\\n Tumbleweed, Fedora Rawhide, or even CentOS 9 Stream.\\n\\n As en example of **not** rolling distribution you can take Ubuntu, openSUSE\\n Leap or Fedora.\\n\\n :::\\n\\n- _bleeding edge_ - contains the latest versions as they are released on the\\n upstream\\n\\n :::tip\\n\\n As an example you can take archLinux, openSUSE Tumbleweed or Fedora Rawhide.\\n You can also notice how common it is to combine _rolling release_ with\\n _bleeding edge_.\\n\\n :::\\n\\n- _upstream_ & _downstream_\\n\\n You\'re most likely to meet these terms in the meaning of upstream being the\\n project itself and downstream being the packaging of said project in some\\n distribution.\\n\\n However this can also apply to distributions like _openSUSE Tumbleweed_ with\\n _openSUSE Leap_, _Fedora_ with _CentOS Stream_, or even _CentOS Stream_ with\\n _RHEL_. 
This basically means that the packages/software is being released into\\n the upstream (Tumbleweed, Fedora, or even CentOS) and then after being tested\\n is taken further down into their respective downstreams (Leap, CentOS, RHEL).\\n\\n[almalinux]: https://almalinux.org/\\n[alp]: https://susealp.io/\\n[distrobox]: https://distrobox.it/\\n[emacs]: https://src.fedoraproject.org/rpms/emacs/\\n[packit]: https://packit.dev/\\n[podman]: https://podman.io/\\n[testing farm]: https://docs.testing-farm.io/Testing%20Farm/0.1/index.html\\n[toolbx]: https://containertoolbx.org/\\n[ubuntu pro]: https://ubuntu.com/pro/\\n[zorin os]: https://zorin.com/os/pro/\\n\\n[^1]: Richard Stallman\\n[^2]: directed acyclic graph\\n[^3]:\\n Ubuntu Pro is technically a service whereas the RHEL and SLE are distros\\n with the support included.\\n\\n[^4]:\\n There are upstream projects that keep LTS branches, such as Linux kernel,\\n but even in the case of the kernel itself, they\'re planning on ending it,\\n since the cost outweighs the benefits at this point."},{"id":"/2024/01/28/rust-opinion","metadata":{"permalink":"/blog/2024/01/28/rust-opinion","editUrl":"https://github.com/mfocko/blog/tree/main/blog/2024-01-28-rust-opinion.md","source":"@site/blog/2024-01-28-rust-opinion.md","title":"Mixed feelings on Rust","description":"Discussing my mixed feelings about the Rust language.\\n","date":"2024-01-28T00:00:00.000Z","formattedDate":"January 28, 2024","tags":[{"label":"rust","permalink":"/blog/tags/rust"},{"label":"memory safety","permalink":"/blog/tags/memory-safety"},{"label":"cult","permalink":"/blog/tags/cult"},{"label":"hype","permalink":"/blog/tags/hype"}],"readingTime":15.395,"hasTruncateMarker":true,"authors":[{"name":"Matej Focko","email":"me+blog@mfocko.xyz","title":"a.k.a. passionate language hater","url":"https://gitlab.com/mfocko","imageURL":"https://github.com/mfocko.png","key":"mf"}],"frontMatter":{"title":"Mixed feelings on Rust","description":"Discussing my mixed feelings about the Rust language.\\n","date":"2024-01-28T00:00:00.000Z","authors":[{"key":"mf","title":"a.k.a. passionate language hater"}],"tags":["rust","memory safety","cult","hype"],"hide_table_of_contents":false},"unlisted":false,"prevItem":{"title":"LTS distributions","permalink":"/blog/2024/02/07/lts-distros"},"nextItem":{"title":"How can Copr help with broken dependencies","permalink":"/blog/2023/08/02/copr"}},"content":"Rust has become a rather popular language these days. I\'ve managed to get my\\nhands dirty with it during _[Advent of Code]_ \u201822 and partially \u201823. I\'ve also\\nused it for few rounds of _[Codeforces]_ and I have to try very hard to maintain\\nsome variety of languages for LeetCode challenges along with the Rust. I\'ll\\ndisclaim up front that I won\'t be only positive, since this post is a result of\\nmultiple discussions about Rust and I stand by\\n_\u201cAll that glitters is not gold\u201d_, so if you can\'t stand your favorite language\\nbeing criticized in any way, don\'t even proceed. :wink:\\n\\n\x3c!--truncate--\x3e\\n\\n## Memory safety\\n\\nI\'ll start by kicking the biggest benefit of the language, the memory safety.\\nLet\'s be honest here, majority of the checks rely on the static analysis, cause\\nyou can\'t do anything else during the compile-time, right? 
Therefore we can\\nbasically say that we are relying on the compiler to \u201csolve\u201d all of our issues.\\n\\n:::warning\\n\\nI\'m not doubting the fact that compiler can prevent **a lot** of the memory\\nerrors, I\'m just saying it\'s not realistic to cover **everything**.\\n\\n:::\\n\\n### Compiler\\n\\nI guess we can safely[^2] agree on the fact that we 100% rely on the compiler to\\n_have our back_. Is the compiler bug-free? I doubt it. This is not meant in an\\noffensive way to the Rust compiler developers, but we need to be realistic here.\\nIt\'s a compiler, even older and larger projects like _gcc_ or _llvm_ can\'t avoid\\nbugs to appear.\\n\\nWhen I was trying out Rust for some of the LeetCode challenges I\'ve stumbled\\nupon the following warning:\\n![Example of a compiler bug](https://i.imgur.com/NfPLF6o.png)\\n\\n:::danger [Issue](https://github.com/rust-lang/rust/issues/59159)\\n\\nThe issue here comes from the fact that we have 2 simultaneous references to the\\nsame memory (one is mutable and one immutable). If you cannot think of any way\\nthis can break, I\'ll give you a rather simple example from C++ where this could\\ncause an issue.\\n\\nImagine a function that has some complex object and also calls a coroutine which\\nutilizes read-only reference to that object. When the coroutine suspends, the\\ncaller can modify the object. This can break the integrity of data read by the\\ncoroutine.\\n\\n- Yes, this **can** cause a memory error.\\n- Yes, this **hasn\'t** been handled until someone noticed it.\\n\\nFixing this bug is not backwards compatible, cause you\'re covering a case that\\nhasn\'t been covered before.\\n\\n:::\\n\\n### Enforcing the safety\\n\\nOne of the ways Rust enforces the safety is by restricting what you can do, like\\nthe example above. Aforementioned issue _can_ happen, but **doesn\'t have to**.\\nRule of the thumb in the Rust compiler is to _\u201cblock\u201d_ anything that can be an\\nissue, static analysis can\'t do much more, it cannot decide whether it\'s safe to\\ndo it or not.\\n\\nSatisfying the Rust compiler is sometimes a brutal pain in the ass, because you\\ncannot do things like you\'re used to, you need to work around them _somehow_.\\n\\n:::tip\\n\\nKey difference between Rust and C or C++ lies in the fact that Rust chooses to\\n_ban_ all \u201cpotentially offensive\u201d actions, C and C++ _relies_ on **you** to be\\nsure it\'s safe to do.\\n\\n![C++ v. Rust](https://i.imgur.com/0vbkYPp.png)\\n\\n:::\\n\\n### Consequences\\n\\nWhere are we heading with this approach of \u201cif it compiles, it runs\u201d though?\\nIn this aspect I have a rather similar opinion as with regards to the ChatGPT\\nand its derivatives.\\n\\nIf you teach people to 100% depend on the compiler, they will do it, cause it\'s\\n_easy_. All you need to do is make the compiler _shut up_[^3]. Giving up the\\n_intellectual masturbation_ about the memory safety will make you lose your edge\\nover the time. When we get to the point of everyone being in the mindset\\nmentioned above, who\'s going to maintain the compiler? This is the place where\\nyou **need to** think about the memory safety and furthermore in a much more\\ngeneral way than in your own projects, because it is the thing that everyone\\n_blindly believes in_ in the end.\\n\\nI\'m not saying that everyone should give up Rust and think about their memory\\nmanagement and potential memory issues. 
I\'m just saying that going the easy way\\nwill make people _dull_ and they should think about it anyways, that\'s how the\\nissue above has been discovered. If everyone walked past and didn\'t think about\\nit, no one would discover this issue till it bit them hard.\\n\\n:::tip Standard library\\n\\nEven the standard library is littered with `unsafe` blocks that are prefixed\\nwith comments in style:\\n\\n```rs\\n// SAFETY: \u2026\\n```\\n\\nThe fact that the _casual_ Rust dev doesn\'t have to think much about safety,\\ncause the compiler has their back, doesn\'t mean that the Rust compiler dev\\ndoesn\'t either.\\n\\nI gotta admit that I adopted this concept in other languages (even in Python),\\ncause you can encounter situations where it doesn\'t have to be clear _why_ you\\ncan do _what_ you\'re doing.\\n\\n:::\\n\\n## Development & design\\n\\nDevelopment of Rust is\u2026 very fast. One positive is that they\'re trying to be as\\nbackward compatible as possible at least by verifying against all the published\\ncrates in the process. Of course, you cannot be backward compatible about fixing\\nthe bugs that have been found, but such is life.\\n\\n### Fast development cycle\\n\\nOne of the negatives of the fast development cycle is the fact that they\'re\\nusing the latest features already in the next release of the Rust. Yes, it is\\nsomething that you can use for verifying and testing your own changes, but at\\nthe same time it places a requirement of the latest release to compile the next\\none.\\n\\n:::tip\\n\\nIf you check `gcc` for example, they have a requirement of minimal version of\\ncompiler that you need for the build. Though gcc\'s requirement is not so _needy_\\nas the Rust one.\\n\\n:::\\n\\nOne of the other negatives is the introduction of bugs. If you\'re pushing\\nchanges, somewhat mindlessly, at such a fast pace, it is inevitable to introduce\\na bunch bugs in the process. Checking the GitHub issue tracker with\\n\\n```\\nis:issue is:open label:C-bug label:T-compiler\\n```\\n\\nyields **2,224** open issues at the time of writing this post.\\n\\n### RFCs\\n\\nYou can find **a lot** of RFCs for the Rust. Some of them are more questionable\\nthan the others. Fun thing is that a lot of them make it to the nightly builds,\\nso they can be tested and polished off. Even the questionable ones\u2026 I\'ll leave\\nfew examples for a better understanding.\\n\\nOne of such features is the `do yeet` expression:\\n\\n```rust\\n#![feature(yeet_expr)]\\n\\nfn foo() -> Result {\\n do yeet 4;\\n}\\nassert_eq!(foo(), Err(4));\\n\\nfn bar() -> Option {\\n do yeet;\\n}\\nassert_eq!(bar(), None);\\n```\\n\\nIt allows you to \u201cyeet\u201d the errors out of the functions that return `Result` or\\n`Option`.\\n\\n[One](https://github.com/rust-lang/rfcs/pull/3503) of the more recent ones is\\nthe ability to include Cargo manifests into the sources, so you can do something\\nlike:\\n\\n```rust\\n#!/usr/bin/env cargo\\n---\\n[dependencies]\\nclap = { version = \\"4.2\\", features = [\\"derive\\"] }\\n---\\n\\nuse clap::Parser;\\n\\n#[derive(Parser, Debug)]\\n#[clap(version)]\\nstruct Args {\\n #[clap(short, long, help = \\"Path to config\\")]\\n config: Option,\\n}\\n\\nfn main() {\\n let args = Args::parse();\\n println!(\\"{:?}\\", args);\\n}\\n```\\n\\nI would say you can get almost anything into the language\u2026\\n\\n## Community and hype train\\n\\nRust community is a rather unique thing. 
A lot of people will hate me for this,\\nbut I can\'t help, but to compare them to _militant vegans_. I\'ll go through some\\nof the things related to it, so I can support my opinion at least.\\n\\n_Rust is the best language._ It is not. There is no best language, each has its\\nown positives and negatives, you need to choose the language that\'s **the most**\\n**suitable for your use case**. There are areas where Rust excels, though I have\\nto admit it\'s very close to being a universal hammer regardless of how suitable\\nit is. There is a very steep learning curve to it, beginnings in Rust are very\\npainful.\\n\\n_Rewrite everything in Rust._ Just no. There are multiple feedbacks on doing\\nrewrites, it is very common to fix _N_ bugs with a rewrite while introducing\\n_N + 1_ other bugs in the process. It doesn\'t solve anything unless there are\\nsome strong reasons to go with it. Majority of such suggested rewrites don\'t\\nhave those reasons though.\\n\\n_Language \u2039x\u203a is bad, though in Rust\u2026_ Cherry-picking one specific pain point of\\none language and reflecting how it is better in other language can go both ways.\\nFor example it is rather easy to pick the limitations imposed by Rust compiler\\nand show how it\'s possible in other languages :man_shrugging:\\n\\nI don\'t mind any of those opinions, you\'re free to have them, as long as you\\ndon\'t rub them in my face which is not the usual case\u2026 This experience makes it\\njust worse for me, part of this post may be also influenced by this fact.\\n\\n### Rust in Linux\\n\\n:::caution\\n\\nAs someone who has seen the way Linux kernel is built in the RHEL ecosystem, how\\ncomplex the whole thing is and how much resources you need to proceed, I have\\nvery strong opinions on this topic.\\n\\n:::\\n\\nIt took years of work to even \u201cincorporate\u201d Rust into the Linux codebase, just\\nto get the \u201cHello World!\u201d. I don\'t have anything against the idea of writing\\ndrivers in the Rust, I bet it can catch a lot of common mistakes, but still\\nintroducing Rust to the kernel is another step to enlarge the monster.\\n\\nI have to admit though that the _Apple GPU_ driver for Linux written in Rust is\\nquite impressive. Apart from that there are not so many benefits, yet\u2026\\n\\n## Packaging\\n\\nI\'ll divide the packaging into the packaging of the language itself and the\\nprograms written in Rust.\\n\\nLet\'s start with the `cargo` itself though. Package managers of the languages\\nusually get a lot of hate (you can take `npm` or `pip` as examples[^1]). If\\nyou\'ve ever tried out Rust, I bet you already know where I\'m going with this.\\nYes, I mean the compilation times, or even Cargo downloading _whole_ index of\\ncrates just so you can update that one dependency (and 3 millions of indirect\\ndeps). When I was doing AoC \u201822 in Rust, I\'ve set up `sccache` right away on the\\nfirst day.\\n\\nLet\'s move to the packaging of the Rust itself, it\'s tedious. Rust has a very\\nfast development cycle and doesn\'t even try to make the builds backward\\ncompatible. If there is a new release of Rust, there is a very high chance that\\nyou cannot build that release with anything other than **the latest** Rust\\nrelease. 
If you have ever touched the packaging, you know that this is something\\nthat can cause a lot of problems, cause you need the second-to-latest version to\\ncompile the latest version, don\'t forget that this applies inductively\u2026 People\\nrunning _Gentoo_ could tell you a lot about this.\\n\\n:::info\\n\\nCompiling the compilers takes usually more time than compiling the kernel\\nitself\u2026\\n\\n:::\\n\\nI cannot speak about packaging of Rust programs in other than RHEL-based\\ndistros, though I can speak about RHEL ecosystem. Fedora packaging guidelines\\nspecify that you need to build each and every dependency of the program\\nseparately. I wanted to try out _AlmaLinux_ and install Alacritty there and I\\nfailed miserably. The solution that worked, consisted of ignoring the packaging\\nguidelines, running `cargo build` and consuming the binaries afterwards.\\nDependencies of the Rust programs are of a similar nature as JS dependencies.\\n\\n> I\'m tipping my fedora[^2] in the general direction of the maintainers of Rust\\n> packages in RHEL ecosystem. I wouldn\'t be able to do this without losing my\\n> sanity.\\n\\n## Likes\\n\\nIf you\'ve come all the way here and you\'re a Rustacean, I believe I\'ve managed\\nto get your blood boiling, so it\'s time to finish this off by stuff I like about\\nRust. I doubt I will be able to cover everything, but I can try at least. You\\nhave to admit it\'s much easier to remember the bad stuff as opposed to the good.\\n:wink:\\n\\n### Workflow and toolchain\\n\\nI prefered using Rust for the _Advent of Code_ and _Codeforces_ as it provides\\na rather easy way to test the solutions before running them with the challenge\\ninput (or test runner). I can give an example from the _Advent of Code_:\\n\\n```rust\\nuse aoc_2023::*;\\n\\ntype Output1 = i32;\\ntype Output2 = Output1;\\n\\nstruct DayXX {}\\nimpl Solution for DayXX {\\n fn new>(pathname: P) -> Self {\\n let lines: Vec = file_to_lines(pathname);\\n\\n todo!()\\n }\\n\\n fn part_1(&mut self) -> Output1 {\\n todo!()\\n }\\n\\n fn part_2(&mut self) -> Output2 {\\n todo!()\\n }\\n}\\n\\nfn main() -> Result<()> {\\n DayXX::main()\\n}\\n\\ntest_sample!(day_XX, DayXX, 42, 69);\\n```\\n\\nThis was the skeleton I\'ve used and the macro at the end is my own creation that\\nexpands to:\\n\\n```rust\\n#[cfg(test)]\\nmod day_XX {\\n use super::*;\\n\\n #[test]\\n fn part_1() {\\n let path = DayXX::get_sample(1);\\n let mut day = DayXX::new(path);\\n assert_eq!(day.part_1(), 42);\\n }\\n\\n #[test]\\n fn part_2() {\\n let path = DayXX::get_sample(2);\\n let mut day = DayXX::new(path);\\n assert_eq!(day.part_2(), 69);\\n }\\n}\\n```\\n\\nWhen you\'re solving the problem, all you need to do is switch between\\n`cargo test` and `cargo run` to check the answer to either sample or the\\nchallenge input itself.\\n\\nIntroduce [bacon] and it gets even better. Bacon is a CLI tool that wraps around\\nthe `cargo` and allows you to check, run, lint or run tests on each file save.\\nIt\'s a very pleasant thing for a so-called _compiler-assisted_ development.\\n\\nSpeaking of linting from within the bacon, you cannot leave out the [clippy].\\nNot only it can whip your ass because of errors, but it can also produce a lot\\nof helpful suggestions, for example passing slices by borrow instead of\\nborrowing the `Vec` itself when you don\'t need it.\\n\\n### Standard library\\n\\nThere\'s **a lot** included in the standard library. It almost feels like you\\nhave all you need[^4]. 
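\\n\\nJust as a small illustration of that breadth, here is a quick, made-up example (not taken from any of the solutions) that counts words with nothing outside `std` (iterator adapters, an ordered map and its entry API):\\n\\n```rust\\nuse std::collections::BTreeMap;\\n\\nfn main() {\\n    let text = \\"the quick brown fox jumps over the lazy dog the end\\";\\n\\n    // Count the words using only the standard library.\\n    let mut counts: BTreeMap<&str, usize> = BTreeMap::new();\\n    for word in text.split_whitespace() {\\n        *counts.entry(word).or_insert(0) += 1;\\n    }\\n\\n    // Find the most frequent word, again without any helper crate.\\n    let most_common = counts.iter().max_by_key(|&(_, count)| count);\\n    println!(\\"{:?}\\", most_common);\\n}\\n```\\n\\n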
I like placeholders (like `todo!()`, `unreachable!()`,\\n`unimplemented!()`) to the extent of\\n[implementing](/cpp/exceptions-and-raii/placeholders) them as exceptions in C++.\\n\\nYou can find almost anything. Though you can also hit some very weird issues\\nwith some of the nuances of the type system.\\n\\n### `unsafe`\\n\\nThis might be something that people like to avoid as much as possible. However I\\nthink that forming a habit of commenting posibly unsafe operations in **any**\\nlanguage is a good habit, as I\'ve mentioned above. You should be able to argue\\nwhy you can do something safely, even if the compiler is not kicking your ass\\nbecause of it.\\n\\nExcerpt of such comment from work:\\n\\n```py\\n# SAFETY: Taking first package instead of specific package should be\\n# safe, since we have put a requirement on \xbbone\xab \u2039upstream_project_url\u203a\\n# per Packit config, i.e. even if we\'re dealing with a monorepo, there\\n# is only \xbbone\xab upstream. If there is one upstream, there is only one\\n# set of GPG keys that can be allowed.\\nreturn self.downstream_config.packages[\\n self.downstream_config._first_package\\n].allowed_gpg_keys\\n```\\n\\n### Traits\\n\\nOne of the other things I like are the traits. They are more restrictive than\\ntemplates or concepts in C++, but they\'re doing their job pretty good. If you\\nare building library and require multiple traits to be satisfied it means a lot\\nof copy-paste, but that\'s soon to be fixed by the [trait aliases].\\n\\n:::tip Comparing to other languages\\n\\nOn Wikipedia I\'ve seen trait being defined as a more restrictive type class as\\nyou may know it from the Haskell for example. C++ isn\'t behind either with its\\n_constraints and concepts_. I would say that we can order them in the following\\norder based on the complexity they can express:\\n\\n```\\nRust\'s trait < Haskell\'s type class < C++\'s concept\\n```\\n\\n:::\\n\\nYou can also hit some issues, like me when trying to support conversions between\\nunderlying numeric types of a 2D vectors or support for using an operator from\\nboth sides (I couldn\'t get `c * u` to work in the same way as `u * c` because\\nthe first one requires you to implement the trait of a built-in type).\\n\\n:::warning Implementation\\n\\nImplementing traits lies in\\n\\n```rust\\nimpl SomeTrait for SomeStruct {\\n // implementation goes here\\n}\\n```\\n\\nOne of the things I **would love to** see is being able to define the helper\\nfunctions within the same block. As of now, the only things allowed are the ones\\nthat are required by the trait, which in the end results in a randomly lying\\nfunctions around (or in a implementation of the structure itself). I don\'t like\\nthis mess at all\u2026\\n\\n:::\\n\\n### Influence of functional paradigm\\n\\nYou can see a big influence of the functional paradigm. Not only in iterators,\\nbut also in the other parts of the language. For example I prefer `Option` or\\n`Result` to `null`s and exceptions. Pattern matching together with\\ncompiler both enforces handling of the errors and rather user-friendly way of\\ndoing it.\\n\\nNot to mention `.and_then()` and such. However spending most of the time with\\nthe AoC you get pretty annoyed of the repetitive `.unwrap()` during parsing,\\nsince you are guaranteed correct input.\\n\\n### Macros\\n\\nMacros are a very strong pro of the Rust. 
And no, we\'re not going to talk about\\nthe procedural macros\u2026\\n\\nAs I\'ve shown above I\'ve managed to \u201ctame\u201d a lot of copy-paste in the tests for\\nthe AoC by utilizing a macro that generated a very basic template for the tests.\\n\\nAs I have mentioned the traits above, I cannot forget to give props to `derive`\\nmacro that allows you to \u201cdeduce\u201d the default implementation. It is very helpful\\nfor a tedious tasks like implementing `Debug` (for printing out the structures)\\nor comparisons, though with the comparisons you need to be careful about the\\ndefault implementation, it has already bitten me once or twice.\\n\\n## Summary\\n\\nOverall there are many things about the Rust I like and would love to see them\\nimplemented in other languages. However there are also many things I don\'t like.\\nNothing is **exclusively** black and white.\\n\\n[advent of code]: https://adventofcode.com\\n[bacon]: https://dystroy.org/bacon/\\n[clippy]: https://github.com/rust-lang/rust-clippy\\n[codeforces]: https://codeforces.com\\n[trait aliases]: https://github.com/rust-lang/rfcs/blob/master/text/1733-trait-alias.md\\n\\n[^1]:\\n not to even mention multiple different packaging standards Python has, which\\n is borderline https://xkcd.com/927/\\n\\n[^2]: pun intended\\n[^3]: It\'s not that easy with the Rust compiler, but OK\u2026\\n[^4]:\\n unlike Python where there\'s whole universe in the language itself, yet there\\n are essential things not present\u2026"},{"id":"/2023/08/02/copr","metadata":{"permalink":"/blog/2023/08/02/copr","editUrl":"https://github.com/mfocko/blog/tree/main/blog/2023-08-02-copr.md","source":"@site/blog/2023-08-02-copr.md","title":"How can Copr help with broken dependencies","description":"Copr comes to save you when maintainer doesn\'t care.","date":"2023-08-02T00:00:00.000Z","formattedDate":"August 2, 2023","tags":[{"label":"\ud83c\udfed","permalink":"/blog/tags/\ud83c\udfed"},{"label":"red-hat","permalink":"/blog/tags/red-hat"},{"label":"copr","permalink":"/blog/tags/copr"},{"label":"admin","permalink":"/blog/tags/admin"},{"label":"vps","permalink":"/blog/tags/vps"}],"readingTime":3.44,"hasTruncateMarker":true,"authors":[{"name":"Matej Focko","email":"me+blog@mfocko.xyz","title":"a.k.a. your opinionated admin","url":"https://gitlab.com/mfocko","imageURL":"https://github.com/mfocko.png","key":"mf"}],"frontMatter":{"title":"How can Copr help with broken dependencies","description":"Copr comes to save you when maintainer doesn\'t care.","date":"2023-08-02T00:00:00.000Z","authors":[{"key":"mf","title":"a.k.a. your opinionated admin"}],"tags":["\ud83c\udfed","red-hat","copr","admin","vps"]},"unlisted":false,"prevItem":{"title":"Mixed feelings on Rust","permalink":"/blog/2024/01/28/rust-opinion"},"nextItem":{"title":"4th week of Advent of Code \'22 in Rust","permalink":"/blog/aoc-2022/4th-week"}},"content":"When you decide to run Fedora on your VPS, you might get screwed over by using\\nrandom repositories\u2026\\n\\n\x3c!--truncate--\x3e\\n\\nWhen I \u201creserved\u201d my VPS[^1] back in June \'20, I slapped Fedora on it without\\nthinking. I bet 99% of people would say that I\'m crazy for doing such thing[^2],\\n**BUT** I\'ve been using Fedora on my PCs for some time already and it felt very\\nstable and natural to just use, even for VPS.\\n\\nOne of the first things I\'ve done was setting up a mail server. 
You may guess\\nwhat\'s the fun part about having a mail server\u2026 Yes, it\'s all the spam you\\nreceive and only then you realize how much \u201ccrap\u201d gets filtered on free mail\\nservices. To battle this problem I chose to use\\n[rspamd](https://github.com/rspamd/rspamd) that had CentOS support, but someone\\nhad a [Copr](https://copr.fedorainfracloud.org/) repository that I used to\\ninstall it.\\n\\n## How does Copr repositories work?\\n\\nIf you have ever used Ubuntu, you might be familiar with the concept since it is\\nvery close to [PPAs](https://help.ubuntu.com/community/PPA).\\n\\ntl;dr of the whole process consists of\\n\\n1. enabling the Copr repository, and\\n2. installing the desired package.\\n\\nSo in shell you would do\\n\\n```\\n# dnf copr enable \u2039copr-repository\u203a\\n# dnf install \u2039package-from-the-repository\u203a\\n```\\n\\nAnd\u2026 that\'s it! Nothing else needed! Simple, right? And literally same process\\nas you would do for the PPA.\\n\\n:::tip AUR\\n\\nOn the other hand, if you are familiar with the archLinux, you definitely know\\nAUR and what it can do for you. Copr repository is pretty similar, but the\\npackages are prebuilt in Copr and Copr repositories can carry the required\\ndependencies for said packages, which simplifies the distribution, and can even\\nhelp with installing singular packages (when you just need the dependency, not\\neverything).\\n\\n:::\\n\\n## My issue\\n\\nNow you might wonder how would I use it on my VPS. It\'s rather simple, once in\\n6 months a new Fedora release comes out. And you need to upgrade to newer\\nrelease\u2026 You don\'t need to do it right away and for such setup it probably isn\'t\\neven recommended.\\n\\n:::tip\\n\\nFedora releases are supported for a year, i.e. they live 6 months till the next\\nrelease and then another 6 months till another release.\\n\\nSome people prefer to run one version \u201cbehind\u201d. If you ever decide to run it on\\nyour home server or in a similar setup, it might be a pretty good idea to\\nfollow. I\'m using the \u201clatest greatest\u201d, cause why not :smile:\\n\\nOne way or another, you still need to bump the release every six months, unless\\nyou\'d bump 2 releases at once every year, which would be a decision, since, at\\nleast I, cannot see any benefits in it\u2026 You don\'t go for \u201cstability\u201d, cause once\\na year you switch to the latest release and then, before you bump, you use one\\nyear old software, so you\'re not even using the latest.\\n\\n:::\\n\\nFast-forward 2 years in the future, new Fedora release came out (October \'22)\\nand I was doing an upgrade. Dependencies of the rspamd have been updated and\\nrspamd builds in Copr have failed and no one fixed it. Cool, so now I can\\nupgrade, but can either ignore the dependencies or uninstall the rspamd\u2026\\n\\n## How can Copr help?\\n\\nI have managed to find\\n[specfile](https://github.com/rspamd/rspamd/blob/master/rpm/rspamd.spec) for the\\nrspamd package that they use for CentOS. There were some files apart from the\\nspecfile, so I had to make an SRPM locally and then\u2026 I just uploaded the SRPM\\nto the Copr to\\n[build](https://copr.fedorainfracloud.org/coprs/mfocko/rspamd/build/5046567/)\\nan RPM.\\n\\nI have switched the previous Copr repository for rspamd with my own and happily\\nproceeded with the upgrade.\\n\\n## Conclusion\\n\\nCopr is heavily used for testing builds on the upstream with\\n[Packit](https://packit.dev). 
However, as you can see, it is possible to use it\\n**very well** for packaging your own stuff and avoiding issues (such as the one\\nI have described above), if need be.\\n\\n[^1]: [vpsFree.cz](https://vpsfree.cz)\\n[^2]:\\n Even though I\'ve been running archLinux on some Raspberry Pi\'s and also\\n on one of my \u201chome servers\u201d, before getting the VPS. You could say I like\\n to live on the edge\u2026"},{"id":"aoc-2022/4th-week","metadata":{"permalink":"/blog/aoc-2022/4th-week","editUrl":"https://github.com/mfocko/blog/tree/main/blog/aoc-2022/04-week-4.md","source":"@site/blog/aoc-2022/04-week-4.md","title":"4th week of Advent of Code \'22 in Rust","description":"Surviving fourth week in Rust.","date":"2023-07-07T15:14:00.000Z","formattedDate":"July 7, 2023","tags":[{"label":"advent-of-code","permalink":"/blog/tags/advent-of-code"},{"label":"advent-of-code-2022","permalink":"/blog/tags/advent-of-code-2022"},{"label":"rust","permalink":"/blog/tags/rust"}],"readingTime":15.175,"hasTruncateMarker":true,"authors":[{"name":"Matej Focko","email":"me+blog@mfocko.xyz","title":"a.k.a. @mf","url":"https://gitlab.com/mfocko","imageURL":"https://github.com/mfocko.png","key":"mf"}],"frontMatter":{"title":"4th week of Advent of Code \'22 in Rust","description":"Surviving fourth week in Rust.","date":"2023-07-07T15:14","slug":"aoc-2022/4th-week","authors":"mf","tags":["advent-of-code","advent-of-code-2022","rust"],"hide_table_of_contents":false},"unlisted":false,"prevItem":{"title":"How can Copr help with broken dependencies","permalink":"/blog/2023/08/02/copr"},"nextItem":{"title":"3rd week of Advent of Code \'22 in Rust","permalink":"/blog/aoc-2022/3rd-week"}},"content":"Let\'s go through the fourth week of [_Advent of Code_] in Rust.\\n\\n\x3c!--truncate--\x3e\\n\\n## [Day 22: Monkey Map](https://adventofcode.com/2022/day/22)\\n\\n:::info tl;dr\\n\\nSimulating a movement on a 2D map with given instructions. Map becomes a cube in\\nthe 2nd part\u2026\\n\\n:::\\n\\n:::caution Rant\\n\\nThis was the most obnoxious problem of this year\u2026 and a lot of Rust issues have\\nbeen hit.\\n\\n:::\\n\\n### Solution\\n\\nIt seems like a very simple problem to solve, but with very obnoxious changes in\\nthe 2nd part and also it\'s relatively hard to decompose \xbbproperly\xab.\\n\\n#### Column iterator\\n\\nIn the first part of the problem it was needed to know the boundaries of each\\nrow and column, since I stored them in `Vec>` and padded with spaces\\nto ensure I have a rectangular 2D \u201carray\u201d. 
However when you wanted to go through\\neach row and column to determine the boundaries, it was very easy to do for the\\nrows (cause each row is a `Vec` element), but not for the columns, since they\\nspan multiple rows.\\n\\nFor this use case I have implemented my own _column iterator_:\\n\\n```rust\\npub struct ColumnIterator<\'a, T> {\\n map: &\'a [Vec],\\n column: usize,\\n\\n i: usize,\\n}\\n\\nimpl<\'a, T> ColumnIterator<\'a, T> {\\n pub fn new(map: &\'a [Vec], column: usize) -> ColumnIterator<\'a, T> {\\n Self { map, column, i: 0 }\\n }\\n}\\n\\nimpl<\'a, T> Iterator for ColumnIterator<\'a, T> {\\n type Item = &\'a T;\\n\\n fn next(&mut self) -> Option {\\n if self.i >= self.map.len() {\\n return None;\\n }\\n\\n self.i += 1;\\n Some(&self.map[self.i - 1][self.column])\\n }\\n}\\n```\\n\\nGiven this piece of an iterator, it is very easy to factor out the common\\nfunctionality between the rows and columns into:\\n\\n```rust\\nlet mut find_boundaries = |constructor: fn(usize) -> Orientation,\\n iterator: &mut dyn Iterator,\\n upper_bound,\\n i| {\\n let mut first_non_empty = iterator.enumerate().skip_while(|&(_, &c)| c == \' \');\\n let start = first_non_empty.next().unwrap().0 as isize;\\n\\n let mut last_non_empty = first_non_empty.skip_while(|&(_, &c)| c != \' \');\\n let end = last_non_empty.next().unwrap_or((upper_bound, &\'_\')).0 as isize;\\n\\n boundaries.insert(constructor(i), start..end);\\n};\\n```\\n\\nAnd then use it as such:\\n\\n```rust\\n// construct all horizontal boundaries\\n(0..map.len()).for_each(|row| {\\n find_boundaries(\\n Orientation::horizontal,\\n &mut map[row].iter(),\\n map[row].len(),\\n row,\\n );\\n});\\n\\n// construct all vertical boundaries\\n(0..map[0].len()).for_each(|col| {\\n find_boundaries(\\n Orientation::vertical,\\n &mut ColumnIterator::new(&map, col),\\n map.len(),\\n col,\\n );\\n});\\n```\\n\\n#### Walking around the map\\n\\nOnce the 2nd part got introduced, you start to think about a way how not to\\ncopy-paste a lot of stuff (I haven\'t avoided it anyways\u2026). In this problem, I\'ve\\nchosen to introduce a trait (i.e. _interface_) for 2D and 3D walker.\\n\\n```rust\\ntrait Wrap: Clone {\\n type State;\\n\\n // simulation\\n fn is_blocked(&self) -> bool;\\n fn step(&mut self, steps: isize);\\n fn turn_left(&mut self);\\n fn turn_right(&mut self);\\n\\n // movement\\n fn next(&self) -> (Self::State, Direction);\\n\\n // final answer\\n fn answer(&self) -> Output;\\n}\\n```\\n\\nEach walker maintains its own state and also provides the functions that are\\nused during the simulation. 
The \u201cpromised\u201d methods are separated into:\\n\\n- _simulation_-related: that are used during the simulation from the `.fold()`\\n- _movement_-related: just a one method that holds most of the logic differences\\n between 2D and 3D\\n- _final answer_: which extracts the _proof of solution_ from the\\n implementation-specific walker\\n\\nBoth 2D and 3D versions borrow the original input and therefore you must\\nannotate the lifetime of it:\\n\\n```rust\\nstruct Wrap2D<\'a> {\\n input: &\'a Input,\\n position: Position,\\n direction: Direction,\\n}\\nimpl<\'a> Wrap2D<\'a> {\\n fn new(input: &\'a Input) -> Wrap2D<\'a> {\\n// \u2026\\n```\\n\\n#### Problems\\n\\nI have used a lot of closures for this problem and once I introduced a parameter\\nthat was of unknown type (apart from the fact it implements a specific trait), I\\ngot suggested a \u201cfix\u201d for the compilation error that resulted in something that\\nwas not possible to parse, cause it, more than likely, violated the grammar.\\n\\nIn a similar fashion, I have been suggested changes that led to a code that\\ndidn\'t make sense by just looking at it (there was no need to try the changes),\\nfor example one suggested change in the closure parameter caused disapperance of\\nthe parameter name. :smile:\\n\\n#### Clippy\\n\\nI have to admit that Clippy was rather helpful here, I\'ll include two examples\\nof rather smart suggestions.\\n\\nWhen writing the parsing for this problem, the first thing I have spotted on the\\n`char` was the `.is_digit()` function that takes a radix as a parameter. Clippy\\nnoticed that I use `radix = 10` and suggested switching to `.is_ascii_digit()`\\nthat does exactly the same thing:\\n\\n```diff\\n- .take_while(|c| c.is_digit(10))\\n+ .take_while(|c| c.is_ascii_digit())\\n```\\n\\nAnother useful suggestion appeared when working with the iterators and I wanted\\nto get the $n$-th element from it. You know the `.skip()`, you know the\\n`.next()`, just \u201cslap\u201d them together and we\'re done for :grin: Well, I got\\nsuggested to use `.nth()` that does exactly the combination of the two mentioned\\nmethods on iterators:\\n\\n```diff\\n- match it.clone().skip(skip).next().unwrap() {\\n+ match it.clone().nth(skip).unwrap() {\\n```\\n\\n## [Day 23: Unstable Diffusion](https://adventofcode.com/2022/day/23)\\n\\n:::info tl;dr\\n\\nSimulating movement of elves around with a set of specific rules.\\n\\n:::\\n\\n### Solution\\n\\nThere\'s not much to mention since it\'s just a cellular automaton simulation\\n(even though the AoC rules for cellular automatons usually get out of hand\\n:wink:).\\n\\nAlthough I had a need to determine boundaries of the elves\' positions and ended\\nup with a nasty DRY violation. Knowing that you you\'re looking for maximum and\\nminimum that are, of course, exactly the same except for initial values and\\ncomparators, it looks like a rather simple fix, but typing in Rust is something\\nelse, right? 
In the end I settled for a function that computes both boundaries\\nwithout any duplication while using a closure:\\n\\n```rust\\nfn get_bounds(positions: &Input) -> (Vector2D, Vector2D) {\\n let f = |init, cmp: &dyn Fn(isize, isize) -> isize| {\\n positions\\n .iter()\\n .fold(Vector2D::new(init, init), |acc, elf| {\\n Vector2D::new(cmp(acc.x(), elf.x()), cmp(acc.y(), elf.y()))\\n })\\n };\\n\\n (f(isize::MAX, &min::), f(isize::MIN, &max::))\\n}\\n```\\n\\nThis function returns a pair of 2D vectors that represent opposite points of the\\nbounding rectangle of all elves.\\n\\nYou might ask why would we need a closure and the answer is that `positions`\\ncannot be captured from within the nested function, only via closure. One more\\nfun fact on top of that is the type of the comparator\\n\\n```rust\\n&dyn Fn(isize, isize) -> isize\\n```\\n\\nOnce we remove the `dyn` keyword, compiler yells at us and also includes a way\\nhow to get a more thorough explanation of the error by running\\n\\n $ rustc --explain E0782\\n\\nwhich shows us\\n\\n Trait objects must include the `dyn` keyword.\\n\\n Erroneous code example:\\n\\n ```\\n trait Foo {}\\n fn test(arg: Box) {} // error!\\n ```\\n\\n Trait objects are a way to call methods on types that are not known until\\n runtime but conform to some trait.\\n\\n Trait objects should be formed with `Box`, but in the code above\\n `dyn` is left off.\\n\\n This makes it harder to see that `arg` is a trait object and not a\\n simply a heap allocated type called `Foo`.\\n\\n To fix this issue, add `dyn` before the trait name.\\n\\n ```\\n trait Foo {}\\n fn test(arg: Box) {} // ok!\\n ```\\n\\n This used to be allowed before edition 2021, but is now an error.\\n\\n:::danger Rant\\n\\nNot all of the explanations are helpful though, in some cases they might be even\\nmore confusing than helpful, since they address _very simple_ use cases.\\n\\nAs you can see, even in this case there are two sides to the explanations:\\n\\n- it explains why you need to use `dyn`, but\\n- it still mentions that trait objects need to be heap-allocated via `Box`\\n that, as you can see in my snippet, **does not** apply here :smile: IMO it\'s\\n caused by the fact that we are borrowing it and therefore we don\'t need to\\n care about the size or whereabouts of it.\\n\\n:::\\n\\n:::info C++ parallel\\n\\nIf you dive into the explanation above, you can notice that the `Box`\\npattern is very helpful for using types that are not known during compile-time.\\nYou would use a very similar approach in C++ when parsing some data structure\\nfrom input (let\'s say JSON for example).\\n\\nOn the other hand, in this case, it doesn\'t really make much sense, cause you\\ncan clearly see that the types **are known** during the compile-time, which in\\nC++ could be easily resolved by templating the helper function.\\n\\n:::\\n\\n## [Day 24: Blizzard Basin](https://adventofcode.com/2022/day/24)\\n\\n:::info tl;dr\\n\\nNavigating your way through a basin with series of blizzards that move around\\nyou as you move.\\n\\n:::\\n\\n:::caution\\n\\nIt\'s second to last day and I went \u201c_bonkers_\u201d on the Rust :smile: Proceed to\\nread _Solution_ part on your own risk.\\n\\n:::\\n\\n### Solution\\n\\nYou are given a map with blizzards all over the place and you\'re supposed to\\nfind the minimum time it requires you to walk through the basin without getting\\nin any of the blizzards.\\n\\n#### Breakdown\\n\\nRelatively simple, yet a bit annoying, approach can be taken. 
It\'s technically\\na shortest-path algorithm implementation with some relaxation restrictions and\\nbeing able to stay on one position for some time, so each _vertex_ of the graph\\nis determined by the position on the map and the _timestamp_. I have chosen to\\nuse `Vector3D`, since `x` and `y` attributes can be used for the position\\nand, well, let\'s use `z` for a timestamp, cause why not, right? :wink:\\n\\n#### Evaluating the blizzards\\n\\n:::caution\\n\\nI think that this is the most perverted abuse of the traits in the whole 4 weeks\\nof AoC in Rust\u2026\\n\\n:::\\n\\nThe blizzards move along their respective directions in time and loop around in\\ntheir respective row/column. Each vertex holds position **and** time, so we can\\n_just_ index the basin with the vertex itself, right? Yes, we can :smiling_imp:\\n\\n:::tip Fun fact\\n\\nWhile writing this part, I\'ve recognized unnecessary verbosity in the code and\\ncleaned it up a bit. The changed version is shown here and the original was just\\nmore verbose.\\n\\n:::\\n\\nI\'ll skip the boring parts of checking bounds and entry/exit of the basin :wink:\\nWe can easily calculate positions of the blizzards using a modular arithmetics:\\n\\n```rust\\nimpl Index for Basin {\\n type Output = char;\\n\\n fn index(&self, index: Position) -> &Self::Output {\\n // \u2039skipped boring parts\u203a\\n\\n // We need to account for the loops of the blizzards\\n let width = self.cols - 2;\\n let height = self.rows - 2;\\n\\n let blizzard_origin = |size, d, t, i| ((i - 1 + size + d * (t % size)) % size + 1) as usize;\\n [\\n (\\n index.y() as usize,\\n blizzard_origin(width, -1, index.z(), index.x()),\\n \'>\',\\n ),\\n (\\n index.y() as usize,\\n blizzard_origin(width, 1, index.z(), index.x()),\\n \'<\',\\n ),\\n (\\n blizzard_origin(height, -1, index.z(), index.y()),\\n index.x() as usize,\\n \'v\',\\n ),\\n (\\n blizzard_origin(height, 1, index.z(), index.y()),\\n index.x() as usize,\\n \'^\',\\n ),\\n ]\\n .iter()\\n .find_map(|&(y, x, direction)| {\\n if self.map[y][x] == direction {\\n Some(&self.map[y][x])\\n } else {\\n None\\n }\\n })\\n .unwrap_or(&\'.\')\\n }\\n}\\n```\\n\\nAs you can see, there is an expression for calculating the original position and\\nit\'s used multiple times, so why not take it out to a lambda, right? :wink:\\n\\nI couldn\'t get the `rustfmt` to format the `for`-loop nicely, so I\'ve just\\ndecided to go with iterating over an elements of a slice. 
I have used, once\\nagain, a combination of two functions (`find_map` in this case) to do 2 things\\nat once and at the end, if we haven\'t found any blizzard, we just return the\\nempty space.\\n\\nI think it\'s a very _nice_ (and naughty) way how to use the `Index` trait, don\'t\\nyou think?\\n\\n#### Shortest-path algorithm\\n\\nFor the shortest path you can choose and adjust any of the common shortest-path\\nalgorithms, in my case, I have decided to use [_A\\\\*_] instead of Dijkstra\'s\\nalgorithm, since it better reflects the _cost_ function.\\n\\n:::info Comparison of costs\\n\\nWith the Dijkstra\'s algorithm I would proceed with the `time` attribute used as\\na priority for the queue.\\n\\nWhereas with the _A\\\\*_, I have chosen to use both time and Manhattan distance\\nthat promotes vertices closer to the exit **and** with a minimum time taken.\\n\\n:::\\n\\nCost function is, of course, a closure :wink:\\n\\n```rust\\nlet cost = |p: Position| p.z() as usize + exit.y().abs_diff(p.y()) + exit.x().abs_diff(p.x());\\n```\\n\\nAnd also for checking the possible moves from the current vertex, I have\\nimplemented, yet another, closure that yields an iterator with the next moves:\\n\\n```rust\\nlet next_positions = |p| {\\n [(0, 0, 1), (0, -1, 1), (0, 1, 1), (-1, 0, 1), (1, 0, 1)]\\n .iter()\\n .filter_map(move |&(x, y, t)| {\\n let next_p = p + Vector3D::new(x, y, t);\\n\\n if basin[next_p] == \'.\' {\\n Some(next_p)\\n } else {\\n None\\n }\\n })\\n};\\n```\\n\\n#### Min-heap\\n\\nIn this case I had a need to use the priority queue taking the elements with the\\nlowest cost as the prioritized ones. Rust only offers you the [`BinaryHeap`] and\\nthat is a max-heap. One of the ways how to achieve a min-heap is to put the\\nelements in wrapped in a [`Reverse`] (as is even showed in the linked [docs of\\nthe `BinaryHeap`]). However the wrapping affects the type of the heap and also\\npopping the most prioritized elements yields values wrapped in the `Reverse`.\\n\\nFor this purpose I have just taken the max-heap and wrapped it as a whole in a\\nseparate structure providing just the desired methods:\\n\\n```rust\\nuse std::cmp::{Ord, Reverse};\\nuse std::collections::BinaryHeap;\\n\\npub struct MinHeap {\\n heap: BinaryHeap>,\\n}\\n\\nimpl MinHeap {\\n pub fn new() -> MinHeap {\\n MinHeap {\\n heap: BinaryHeap::new(),\\n }\\n }\\n\\n pub fn push(&mut self, item: T) {\\n self.heap.push(Reverse(item))\\n }\\n\\n pub fn pop(&mut self) -> Option {\\n self.heap.pop().map(|Reverse(x)| x)\\n }\\n}\\n\\nimpl Default for MinHeap {\\n fn default() -> Self {\\n Self::new()\\n }\\n}\\n```\\n\\nRest is just the algorithm implementation which is not that interesting.\\n\\n## [Day 25: Full of Hot Air](https://adventofcode.com/2022/day/25)\\n\\n:::info tl;dr\\n\\nPlaying around with a numbers in a _special_ base.\\n\\n:::\\n\\nGetting flashbacks to the _IB111 Foundations of Programming_\u2026 Very nice \u201cproblem\u201d\\nwith a rather easy solution, as the last day always seems to be.\\n\\n### Solution\\n\\nImplementing 2 functions, converting from the _SNAFU base_ and back to the _SNAFU_\\n_base_ representation. Let\'s do a bit more though! I have implemented two functions:\\n\\n- `from_snafu`\\n- `to_snafu`\\n\\nNow it is apparent that all I do is number to string and string to number. Hmm\u2026\\nthat sounds familiar, doesn\'t it? 
Let\'s introduce a structure for the SNAFU numbers\\nand implement the traits that we need.\\n\\nLet\'s start with a structure:\\n\\n```rust\\n#[derive(Debug, PartialEq, Eq, PartialOrd, Ord)]\\nstruct SNAFU {\\n value: i64,\\n}\\n```\\n\\n#### Converting from `&str`\\n\\nWe will start by implementing the `FromStr` trait that will help us parse our input.\\nThis is rather simple, I can just take the `from_snafu` function, copy-paste it\\ninto the `from_str` method and the number I get will be wrapped in `Result` and\\n`SNAFU` structure.\\n\\n#### Converting to `String`\\n\\nThis is more fun. In some cases you need to implement only one trait and others\\nare automatically implemented using that one trait. In our case, if you look in\\nthe documentation, you can see that `ToString` trait is automatically implemented\\nfor any type that implements `Display` trait.\\n\\nLet\'s implement the `Display` trait then. We should be able to use the `to_snafu`\\nfunction and just take the `self.value` from the `SNAFU` structure.\\n\\nAnd for the convenience of tests, we can also implement a rather simple `From`\\ntrait for the `SNAFU`.\\n\\n#### Adjusting the code\\n\\nAfter those changes we need to adjust the code and tests.\\n\\nParsing of the input is very easy, before we have used the lines, now we parse\\neverything:\\n\\n```diff\\n fn parse_input>(pathname: P) -> Input {\\n- file_to_lines(pathname)\\n+ file_to_structs(pathname)\\n }\\n```\\n\\nPart 1 needs to be adjusted a bit too:\\n\\n```diff\\n fn part_1(input: &Input) -> Output {\\n- to_snafu(input.iter().map(|s| from_snafu(s)).sum())\\n+ SNAFU::from(input.iter().map(|s| s.value).sum::()).to_string()\\n }\\n```\\n\\nYou can also see that it simplifies the meaning a bit and it is more explicit than\\nthe previous versions.\\n\\nAnd for the tests:\\n\\n```diff\\n #[test]\\n fn test_from() {\\n- for (n, s) in EXAMPLES.iter() {\\n- assert_eq!(from_snafu(s), *n);\\n+ for (&n, s) in EXAMPLES.iter() {\\n+ assert_eq!(s.parse::().unwrap().value, n);\\n }\\n }\\n\\n #[test]\\n fn test_to() {\\n- for (n, s) in EXAMPLES.iter() {\\n- assert_eq!(to_snafu(*n), s.to_string());\\n+ for (&n, s) in EXAMPLES.iter() {\\n+ assert_eq!(SNAFU::from(n).to_string(), s.to_string());\\n }\\n```\\n\\n## Summary\\n\\nLet\'s wrap the whole thing up! Keeping in mind both AoC and the Rust\u2026\\n\\n![Finished advent calendar :smile:](/img/blog/aoc-2022/04-week-4/calendar.png)\\n\\n### Advent of Code\\n\\nThis year was quite fun, even though most of the solutions and posts came in\\nlater on (_cough_ in \'23 _cough_). Day 22 was the most obnoxious one\u2026 And also\\nit feels like I used priority queues and tree data structures **a lot** :eyes:\\n\\n### with Rust\\n\\nI must admit that a lot of compiler warnings and errors were very useful. Even\\nthough I still found some instances where they didn\'t help at all or cause even\\nworse issues than I had. Compilation times have been addressed with the caching.\\n\\nBuilding my first tree data structure in Rust has been a very \u201cinteresting\u201d\\njourney. 
Being able to write a more generic BFS algorithm that allows you to not\\nduplicate code while still mantaining the desired functionality contributes to\\na very readable code.\\n\\nI am definitely much more aware of the basic things that bloated Python is\\nmissing, yet Rust has them\u2026\\n\\nUsing explicit types and writing down placeholder functions with `todo!()`\\nmacros is very pleasant, since it allows you to easily navigate the type system\\nduring the development when you don\'t even need to be sure how are you going to\\nput the smaller pieces together.\\n\\nI have used a plethora of traits and also implemented some of them to either be\\nidiomatic, or exploit the syntactic sugar they offer. Deriving the default trait\\nimplementation is also very helpful in a lot of cases, e.g. debugging output,\\ncopying, equality comparison, etc.\\n\\nI confess to touching more \u201ccursed\u201d parts of the Rust, such as macros to\\ndeclutter the copy-paste for tests or writing my own structures that need to\\ncarry a lifetime for their own fields.\\n\\ntl;dr Relatively pleasant language until you hit brick wall :wink:\\n\\n---\\n\\nSee you next year! Maybe in Rust, maybe not :upside_down_face:\\n\\n[_advent of code_]: https://adventofcode.com\\n[_a\\\\*_]: https://en.wikipedia.org/wiki/A*_search_algorithm\\n[`binaryheap`]: https://doc.rust-lang.org/std/collections/struct.BinaryHeap.html\\n[`reverse`]: https://doc.rust-lang.org/std/cmp/struct.Reverse.html\\n[docs of the `binaryheap`]: https://doc.rust-lang.org/std/collections/struct.BinaryHeap.html#min-heap"},{"id":"aoc-2022/3rd-week","metadata":{"permalink":"/blog/aoc-2022/3rd-week","editUrl":"https://github.com/mfocko/blog/tree/main/blog/aoc-2022/03-week-3.md","source":"@site/blog/aoc-2022/03-week-3.md","title":"3rd week of Advent of Code \'22 in Rust","description":"Surviving third week in Rust.","date":"2023-07-06T21:00:00.000Z","formattedDate":"July 6, 2023","tags":[{"label":"advent-of-code","permalink":"/blog/tags/advent-of-code"},{"label":"advent-of-code-2022","permalink":"/blog/tags/advent-of-code-2022"},{"label":"rust","permalink":"/blog/tags/rust"}],"readingTime":11.57,"hasTruncateMarker":true,"authors":[{"name":"Matej Focko","email":"me+blog@mfocko.xyz","title":"a.k.a. 
@mf","url":"https://gitlab.com/mfocko","imageURL":"https://github.com/mfocko.png","key":"mf"}],"frontMatter":{"title":"3rd week of Advent of Code \'22 in Rust","description":"Surviving third week in Rust.","date":"2023-07-06T21:00","slug":"aoc-2022/3rd-week","authors":"mf","tags":["advent-of-code","advent-of-code-2022","rust"],"hide_table_of_contents":false},"unlisted":false,"prevItem":{"title":"4th week of Advent of Code \'22 in Rust","permalink":"/blog/aoc-2022/4th-week"},"nextItem":{"title":"Sort the matrix diagonally","permalink":"/blog/leetcode/sort-diagonally"}},"content":"Let\'s go through the third week of [_Advent of Code_] in Rust.\\n\\n\x3c!--truncate--\x3e\\n\\n## [Day 15: Beacon Exclusion Zone](https://adventofcode.com/2022/day/15)\\n\\n:::info tl;dr\\n\\nTriangulating a distress beacon based on the information from the sensors.\\n\\n:::\\n\\n### Solution\\n\\nRelatively easy thing to implement, no major Rust issues hit.\\n\\n## [Day 16: Proboscidea Volcanium](https://adventofcode.com/2022/day/16)\\n\\n:::info tl;dr\\n\\nFinding a max flow in a graph given some time constraints.\\n\\n:::\\n\\n### Solution\\n\\nI have used some interesting things to implement this and make it easier for me.\\n\\n#### Indexing in graph\\n\\nI have come across a situation where I needed to keep more information regarding\\nthe graph\u2026 In that case you can, of course, create a structure and keep it in,\\nbut once you have multiple members in the structure it gets harder to work with\\nsince you need to address the fields in the structure. When you work with graph,\\nyou frequently need to access the vertices and in this case it felt a lot easier\\nto implement the indexing in a graph, rather than explicitly access the\\nunderlying data structure.\\n\\nHere you can see a rather short snippet from the solution that allows you to\\n\u201cindex\u201d the graph:\\n\\n```rust\\nimpl Index<&str> for Graph {\\n type Output = Vertex;\\n\\n fn index(&self, index: &str) -> &Self::Output {\\n &self.g[index]\\n }\\n}\\n```\\n\\n#### Cartesian product\\n\\nDuring the implementation I had to utilize Floyd-Warshall algorithm for finding\\nthe shortest path between pairs of vertices and utilized the `iproduct!` macro\\nfrom the [`itertools`]. It is a very useful higher-order function that allows\\nyou to keep the nesting of the loops at a minimum level while still maintaining\\nthe same functionality.\\n\\n#### \u201cImplementing\u201d an iterator\\n\\nFor the second part, you get to split the work between 2 actors. 
That way you\\ncan achieve higher efficiency of the whole process that you\'re planning, but it\\nalso makes it harder to evaluate algorithmically, since you need to check the\\ndifferent ways the work can be split.\\n\\nBeing affected by _functional programming brain damage_:tm:, I have chosen to\\ndo this part by function that returns an iterator over the possible ways:\\n\\n```rust\\nfn pairings(\\n valves: &BTreeSet,\\n) -> impl Iterator, BTreeSet)> + \'_ {\\n let mapping = valves.iter().collect_vec();\\n\\n let max_mask = 1 << (valves.len() - 1);\\n\\n (0..max_mask).map(move |mask| {\\n let mut elephant = BTreeSet::new();\\n let mut human = BTreeSet::new();\\n\\n for (i, &v) in mapping.iter().enumerate() {\\n if (mask & (1 << i)) == 0 {\\n human.insert(v.clone());\\n } else {\\n elephant.insert(v.clone());\\n }\\n }\\n\\n (human, elephant)\\n })\\n}\\n```\\n\\n## [Day 17: Pyroclastic Flow](https://adventofcode.com/2022/day/17)\\n\\n:::info tl;dr\\n\\nSimulating an autonomous Tetris where pieces get affected by a series of jets of\\nhot gas.\\n\\n:::\\n\\n### Solution\\n\\nSimilarly to the previous day I have created some iterators :smile:\\n\\n#### Collision detection\\n\\nOnce you need to check for collisions it is very helpful to be able to just\\niterate through the positions that can actually collide with the wall or other\\npiece.\\n\\nTo get the desired behaviour, you can just compose few smaller functions:\\n\\n```rust\\nfn occupied(shape: &[Vec]) -> impl Iterator + \'_ {\\n shape.iter().enumerate().flat_map(|(y, row)| {\\n row.iter().enumerate().filter_map(move |(x, c)| {\\n if c == &\'#\' {\\n Some(Vector2D::new(x as isize, y as isize))\\n } else {\\n None\\n }\\n })\\n })\\n}\\n```\\n\\nIn the end, we get relative positions which we can adjust later when given the\\nspecific positions from iterator. You can see some interesting parts in this:\\n\\n- `.enumerate()` allows us to get both the indices (coordinates) and the line\\n or, later on, the character itself,\\n- `.flat_map()` flattens the iterator, i.e. when we return another iterator,\\n they just get chained instead of iterating over iterators (which sounds pretty\\n disturbing, doesn\'t it?),\\n- and finally `.filter_map()` which is pretty similar to the \u201cbasic\u201d `.map()`\\n with a one, key, difference that it expects the items of an iterator to be\\n mapped to an `Option` from which it ignores nothing (as in `None` :wink:)\\n and also unwraps the values from `Some(\u2026)`.\\n\\n#### Infinite iterator\\n\\nIn the solution we cycle through both Tetris-like shapes that fall down and the\\njets that move our pieces around. Initially I have implemented my own infinite\\niterator that just yields the indices. It is a very simple, yet powerful, piece\\nof code:\\n\\n```rust\\nstruct InfiniteIndex {\\n size: usize,\\n i: usize,\\n}\\n\\nimpl InfiniteIndex {\\n fn new(size: usize) -> InfiniteIndex {\\n InfiniteIndex { size, i: size - 1 }\\n }\\n}\\n\\nimpl Iterator for InfiniteIndex {\\n type Item = usize;\\n\\n fn next(&mut self) -> Option {\\n self.i = (self.i + 1) % self.size;\\n Some(self.i)\\n }\\n}\\n```\\n\\nHowever when I\'m looking at the code now, it doesn\'t really make much sense\u2026\\nGuess what, we can use a built-in function that is implemented on iterators for\\nthat! 
The function is called `.cycle()`\\n\\nOn the other hand, I am not going to switch to that function, since it would\\nintroduce an another myriad of issues caused by the fact that I create iterators\\nright away in the constructor of my structure and the iterators would borrow\\nboth the jets and shapes which would introduce a lifetime dependency into the\\nstructure.\\n\\n## [Day 18: Boiling Boulders](https://adventofcode.com/2022/day/18)\\n\\n:::info tl;dr\\n\\nLet\'s compute a surface area of some obsidian approximated via coordinates of\\ncubes.\\n\\n:::\\n\\n### Solution\\n\\nThis day is kinda interesting, because it shows how easily you can complicate the\\nproblem and also how much can you screw yourself over with the optimization and\\n\u201csmart\u201d approach.\\n\\nFor the first part you need to find the surface area of an obsidian that is\\napproximated by cubes. Now, that is a very easy thing to do, just keep the track\\nof already added cubes, and check if the newly added cube touches any face of any\\nother cube. Simple, and with a `BTreeSet` relatively efficient way to do it.\\n\\nHowever the second part lets you on a secret that there may be some surface area\\nfrom the \u201cinside\u201d too and you want to know only the one from the outside of the\\nobsidian. I have seen some solutions later, but if you check your data, you might\\nnotice that the bounding box of all the cubes isn\'t that big at all. Therefore I\\nchose to pre-construct the box beforehand, fill in the cubes and then just run a\\nBFS turning all the lava on the outside into the air. Now you just need to check\\ncubes and count how many of their faces touch the air.\\n\\n## [Day 19: Not Enough Minerals](https://adventofcode.com/2022/day/19)\\n\\n:::info tl;dr\\n\\nFinding out the best strategy for building robots to collect geodes.\\n\\n:::\\n\\n### Solution\\n\\nNot much interesting stuff to mention apart from the suggestion to never believe\\nthat the default implementation given by `derive` macro is what you want, it\\ndoesn\'t have to be. :smile:\\n\\n## [Day 20: Grove Positioning System](https://adventofcode.com/2022/day/20)\\n\\n:::info tl;dr\\n\\nShuffling around the _circular linked list_ to find the coordinates.\\n\\n:::\\n\\nNow, small rant for this day is in place. They\'ve never mentioned that coordinates\\ncan repeat and therefore the values are non-unique. This is something that did\\nnot happen in the given sample, but was present in the user input. It took \xbba lot\xab\\nto realize that this is the issue.\\n\\n### Solution\\n\\nI have tried implementing a circular linked list for this\u2026 and I have failed\\nmiserably. To be fair, I still have no clue why. It was \u201cfun\u201d to play around with\\nthe `Rc>`. In the end I failed on _wrong answer_. 
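\\n\\nTo give an idea of the shape of the type involved, here is a generic sketch of such a node (not the actual solution, just the usual shared-ownership-plus-interior-mutability pattern):\\n\\n```rust\\nuse std::cell::RefCell;\\nuse std::rc::Rc;\\n\\n// Shared, mutable ownership of a neighbouring node.\\ntype Link = Option<Rc<RefCell<Node>>>;\\n\\nstruct Node {\\n    value: i64,\\n    prev: Link,\\n    next: Link,\\n}\\n\\nimpl Node {\\n    fn new(value: i64) -> Rc<RefCell<Node>> {\\n        Rc::new(RefCell::new(Node {\\n            value,\\n            prev: None,\\n            next: None,\\n        }))\\n    }\\n}\\n\\nfn main() {\\n    let a = Node::new(1);\\n    let b = Node::new(2);\\n\\n    // Wire a and b in both directions; with strong references on both sides\\n    // this cycle leaks, which is why one direction is usually a `Weak`.\\n    a.borrow_mut().next = Some(Rc::clone(&b));\\n    b.borrow_mut().prev = Some(Rc::clone(&a));\\n    b.borrow_mut().next = Some(Rc::clone(&a));\\n    a.borrow_mut().prev = Some(Rc::clone(&b));\\n\\n    println!(\\"{}\\", a.borrow().next.as_ref().unwrap().borrow().value); // 2\\n}\\n```\\n\\n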
I have also encountered\\na rather interesting issue with `.borrow_mut()` method being used on `Rc>`.\\n\\n#### `.borrow_mut()`\\n\\nConsider the following snippet of the code (taken from the documentation):\\n\\n```rust\\nuse std::cell::{RefCell, RefMut};\\nuse std::collections::HashMap;\\nuse std::rc::Rc;\\n// use std::borrow::BorrowMut;\\n\\nfn main() {\\n let shared_map: Rc> = Rc::new(RefCell::new(HashMap::new()));\\n // Create a new block to limit the scope of the dynamic borrow\\n {\\n let mut map: RefMut<_> = shared_map.borrow_mut();\\n map.insert(\\"africa\\", 92388);\\n map.insert(\\"kyoto\\", 11837);\\n map.insert(\\"piccadilly\\", 11826);\\n map.insert(\\"marbles\\", 38);\\n }\\n\\n // Note that if we had not let the previous borrow of the cache fall out\\n // of scope then the subsequent borrow would cause a dynamic thread panic.\\n // This is the major hazard of using `RefCell`.\\n let total: i32 = shared_map.borrow().values().sum();\\n println!(\\"{total}\\");\\n}\\n```\\n\\nWe allocate a hash map on the heap and then in the inner block, we borrow it as\\na mutable reference, so that we can use it.\\n\\n:::note\\n\\nIt is a very primitive example for `Rc>` and mutable borrow.\\n\\n:::\\n\\nIf you uncomment the 4th line with `use std::borrow::BorrowMut;`, you cannot\\ncompile the code anymore, because of\\n\\n```\\n Compiling playground v0.0.1 (/playground)\\nerror[E0308]: mismatched types\\n --\x3e src/main.rs:10:34\\n |\\n10 | let mut map: RefMut<_> = shared_map.borrow_mut();\\n | --------- ^^^^^^^^^^^^^^^^^^^^^^^ expected struct `RefMut`, found mutable reference\\n | |\\n | expected due to this\\n |\\n = note: expected struct `RefMut<\'_, _>`\\n found mutable reference `&mut Rc>>`\\n\\nerror[E0599]: no method named `insert` found for struct `RefMut<\'_, _>` in the current scope\\n --\x3e src/main.rs:11:13\\n |\\n11 | map.insert(\\"africa\\", 92388);\\n | ^^^^^^ method not found in `RefMut<\'_, _>`\\n\\nerror[E0599]: no method named `insert` found for struct `RefMut<\'_, _>` in the current scope\\n --\x3e src/main.rs:12:13\\n |\\n12 | map.insert(\\"kyoto\\", 11837);\\n | ^^^^^^ method not found in `RefMut<\'_, _>`\\n\\nerror[E0599]: no method named `insert` found for struct `RefMut<\'_, _>` in the current scope\\n --\x3e src/main.rs:13:13\\n |\\n13 | map.insert(\\"piccadilly\\", 11826);\\n | ^^^^^^ method not found in `RefMut<\'_, _>`\\n\\nerror[E0599]: no method named `insert` found for struct `RefMut<\'_, _>` in the current scope\\n --\x3e src/main.rs:14:13\\n |\\n14 | map.insert(\\"marbles\\", 38);\\n | ^^^^^^ method not found in `RefMut<\'_, _>`\\n\\nSome errors have detailed explanations: E0308, E0599.\\nFor more information about an error, try `rustc --explain E0308`.\\nerror: could not compile `playground` due to 5 previous errors\\n```\\n\\nIt might seem **a bit** ridiculous. However, I got to a point where the compiler\\nsuggested `use std::borrow::BorrowMut;` and it resulted in breaking parts of the\\ncode that worked previously. I think it may be a good idea to go over what is\\nhappening here.\\n\\n##### `.borrow_mut()` on `Rc>`\\n\\nLet\'s consider a variable `x` of type `Rc>`. What happens when you\\ncall `.borrow_mut()` on it? We can look at the `Rc` type, and\u2026 hang on! There is\\nneither `.borrow_mut()` method or `BorrowMut` trait implemented. How can we do it\\nthen?\\n\\nLet\'s go further and we can see that `RefCell` implements a `.borrow_mut()`\\nmethod. OK, but how can we call it on the `Rc`? Easily! 
`Rc` implements\\n`Deref` and therefore you can call methods on `Rc` objects as if they were\\n`T` objects. If we read on _`Deref` coercion_, we can see the following:\\n\\n> If `T` implements `Deref`, \u2026:\\n>\\n> - \u2026\\n> - `T` implicitly implements all the (immutable) methods of the type `U`.\\n\\nWhat is the requirement for the `.borrow_mut()` on `RefCell`? Well, it needs\\n`&self`, so the `Deref` implements the `.borrow_mut()` for the `Rc>`.\\n\\n##### `BorrowMut` trait\\n\\nI have not been able to find a lot on this trait. My guess is that it provides a\\nmethod instead of a syntactic sugar (`&mut x`) for the mutable borrow. And also\\nit provides default implementations for the types:\\n\\n```rust\\nimpl BorrowMut for String\\n\\nimpl BorrowMut for &mut T\\nwhere\\n T: ?Sized,\\n\\nimpl BorrowMut for T\\nwhere\\n T: ?Sized,\\n\\nimpl BorrowMut<[T]> for Vec\\nwhere\\n A: Allocator,\\n\\nimpl BorrowMut for Box\\nwhere\\n A: Allocator,\\n T: ?Sized,\\n\\nimpl BorrowMut<[T]> for [T; N]\\n```\\n\\n##### Conflict\\n\\nNow the question is why did it break the code\u2026 My first take was that the type\\n`Rc>` has some _specialized_ implementation of the `.borrow_mut()` and\\nthe `use` overrides it with the default, which is true **in a sense**. However\\nthere is no _specialized_ implementation. Let\'s have a look at the trait and the\\ntype signature on the `RefCell`:\\n\\n```rust\\n// trait\\npub trait BorrowMut: Borrow\\nwhere\\n Borrowed: ?Sized,\\n{\\n fn borrow_mut(&mut self) -> &mut Borrowed;\\n}\\n\\n// \u2039RefCell.borrow_mut()\u203a type signature\\npub fn borrow_mut(&self) -> RefMut<\'_, T>\\n```\\n\\nI think that we can definitely agree on the fact that `RefMut<\'_, T>` is not the\\n`RefCell`.\\n\\n**In my opinion**, `RefCell` implements a **separate** `.borrow_mut()` rather\\nthan implementing the interface, because it **cannot** satisfy the type requirements\\nof the trait.\\n\\n:::caution\\n\\nI wonder how are we expected to deal with this conflict, if and when, we need\\nboth the `.borrow_mut()` of the trait and `.borrow_mut()` of the `RefCell`.\\n\\n:::\\n\\n:::tip Fun fact\\n\\nI was suggested by the compiler to do `use std::borrow::BorrowMut;` and break the\\ncode.\\n\\nSo much for the _almighty_ and _helpful_ compiler\u2026\\n\\n:::\\n\\n## [Day 21: Monkey Math](https://adventofcode.com/2022/day/21)\\n\\n:::info tl;dr\\n\\nComputing an expression tree and then also finding ideal value for a node.\\n\\n:::\\n\\n### Solution\\n\\nRelatively simple, until you get to the 2nd part where you start to practice\\na lot of the copy-paste. I have managed to sneak some perverted stuff in there\\nthough :) Let\'s go through the details.\\n\\n#### `Default` trait\\n\\nFor the first time and twice I had a need to have a default value for my types,\\nenumerations in this case. Rust offers a very nice trait[^1] that is described\\nas:\\n\\n> A trait for giving a type a useful default value.\\n\\nI guess it sums it up nicely. The more interesting part about this is the fact\\nthat you can use the _macro machinery_ to save yourself some typing. If you have\\nenumeration of which the default value doesn\'t bear any parameter, you can just\\ndo[^2]:\\n\\n```rust\\n#[derive(Default)]\\nenum Color {\\n #[default]\\n White,\\n Gray,\\n Black,\\n}\\n```\\n\\n#### Abusing negation\\n\\nIf you want to use a _unary minus_ operator on your own type, you can implement\\na `Neg` trait[^3]. 
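\\n\\nA minimal sketch of what such an implementation can look like, with a made-up `Side` enum used purely for illustration:\\n\\n```rust\\nuse std::ops::Neg;\\n\\n#[derive(Debug, Clone, Copy, PartialEq, Eq)]\\nenum Side {\\n    Left,\\n    Right,\\n}\\n\\nimpl Neg for Side {\\n    type Output = Side;\\n\\n    // `-side` flips to the opposite side.\\n    fn neg(self) -> Self::Output {\\n        match self {\\n            Side::Left => Side::Right,\\n            Side::Right => Side::Left,\\n        }\\n    }\\n}\\n\\nfn main() {\\n    assert_eq!(-Side::Left, Side::Right);\\n}\\n```\\n\\n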
I was dealing with a binary tree and needed a way how to look\\nat the other side, so I have just implemented the negation for flipping between\\nleft and right :smile:\\n\\n[^1]: [`Default`](https://doc.rust-lang.org/std/default/trait.Default.html) docs\\n[^2]: Pardon my example from the graph algorithms ;)\\n[^3]: [`Neg`](https://doc.rust-lang.org/std/ops/trait.Neg.html) docs\\n\\n[_advent of code_]: https://adventofcode.com\\n[`itertools`]: https://crates.io/crates/itertools\\n[this reddit post and the comment]: https://www.reddit.com/r/adventofcode/comments/zb98pn/comment/iyq0ono"},{"id":"leetcode/sort-diagonally","metadata":{"permalink":"/blog/leetcode/sort-diagonally","editUrl":"https://github.com/mfocko/blog/tree/main/blog/leetcode/sort-matrix-diagonally.md","source":"@site/blog/leetcode/sort-matrix-diagonally.md","title":"Sort the matrix diagonally","description":"Compiler assisted development.","date":"2023-03-04T23:15:00.000Z","formattedDate":"March 4, 2023","tags":[{"label":"cpp","permalink":"/blog/tags/cpp"},{"label":"leetcode","permalink":"/blog/tags/leetcode"},{"label":"iterators","permalink":"/blog/tags/iterators"}],"readingTime":16.99,"hasTruncateMarker":true,"authors":[{"name":"Matej Focko","email":"me+blog@mfocko.xyz","title":"a.k.a. @mf","url":"https://gitlab.com/mfocko","imageURL":"https://github.com/mfocko.png","key":"mf"}],"frontMatter":{"title":"Sort the matrix diagonally","description":"Compiler assisted development.","date":"2023-03-04T23:15","slug":"leetcode/sort-diagonally","authors":"mf","tags":["cpp","leetcode","iterators"],"hide_table_of_contents":false},"unlisted":false,"prevItem":{"title":"3rd week of Advent of Code \'22 in Rust","permalink":"/blog/aoc-2022/3rd-week"},"nextItem":{"title":"2nd week of Advent of Code \'22 in Rust","permalink":"/blog/aoc-2022/2nd-week"}},"content":"Let\'s try to solve one of the LeetCode challenges in easy and hard mode at the\\nsame time.\\n\\n\x3c!--truncate--\x3e\\n\\n- Link to the problem: https://leetcode.com/problems/sort-the-matrix-diagonally/\\n\\n## Problem description\\n\\nA **matrix diagonal** is a diagonal line of cells starting from some cell in\\neither the topmost row or leftmost column and going in the bottom-right direction\\nuntil reaching the matrix\'s end. For example, the **matrix diagonal** starting\\nfrom `mat[2][0]`, where `mat` is a `6 x 3` matrix, includes cells `mat[2][0]`,\\n`mat[3][1]`, and `mat[4][2]`.\\n\\nGiven an `m x n` matrix `mat` of integers, sort each matrix diagonal in ascending\\norder and return the resulting matrix.\\n\\n### Example\\n\\n![Image describing the problem](https://assets.leetcode.com/uploads/2020/01/21/1482_example_1_2.png)\\n\\n## Skeleton and initial adjustments\\n\\nWe are given the following skeleton for the C++ and the given challenge:\\n\\n```cpp\\nclass Solution {\\npublic:\\n vector> diagonalSort(vector>& mat) {\\n\\n }\\n};\\n```\\n\\nThe task is to sort the passed matrix diagonally and then return it. First of all,\\nI don\'t like to solve this in a web browser, so we\'ll need to adjust it accordingly\\nfor running it locally. 
We\'ll start by including the `vector` header and using\\nfully-qualified namespaces[^1] and also adding few tests:\\n\\n```cpp\\n#include \\n#include \\n\\nusing matrix = std::vector>;\\n\\nclass Solution {\\npublic:\\n matrix diagonalSort(matrix& mat)\\n {\\n }\\n};\\n\\nstatic void test_case_1()\\n{\\n // Input: mat = [[3,3,1,1],[2,2,1,2],[1,1,1,2]]\\n // Output: [[1,1,1,1],[1,2,2,2],[1,2,3,3]]\\n\\n Solution s;\\n assert((s.diagonalSort(std::vector { std::vector { 3, 3, 1, 1 },\\n std::vector { 2, 2, 1, 2 },\\n std::vector { 1, 1, 1, 2 } })\\n == std::vector { std::vector { 1, 1, 1, 1 },\\n std::vector { 1, 2, 2, 2 },\\n std::vector { 1, 2, 3, 3 } }));\\n}\\n\\nstatic void test_case_2()\\n{\\n // Input: mat =\\n // [[11,25,66,1,69,7],[23,55,17,45,15,52],[75,31,36,44,58,8],[22,27,33,25,68,4],[84,28,14,11,5,50]]\\n // Output:\\n // [[5,17,4,1,52,7],[11,11,25,45,8,69],[14,23,25,44,58,15],[22,27,31,36,50,66],[84,28,75,33,55,68]]\\n\\n Solution s;\\n assert((s.diagonalSort(std::vector { std::vector { 11, 25, 66, 1, 69, 7 },\\n std::vector { 23, 55, 17, 45, 15, 52 },\\n std::vector { 75, 31, 36, 44, 58, 8 },\\n std::vector { 22, 27, 33, 25, 68, 4 },\\n std::vector { 84, 28, 14, 11, 5, 50 } })\\n == std::vector { std::vector { 5, 17, 4, 1, 52, 7 },\\n std::vector { 11, 11, 25, 45, 8, 69 },\\n std::vector { 14, 23, 25, 44, 58, 15 },\\n std::vector { 22, 27, 31, 36, 50, 66 },\\n std::vector { 84, 28, 75, 33, 55, 68 } }));\\n}\\n\\nint main()\\n{\\n test_case_1();\\n test_case_2();\\n\\n return 0;\\n}\\n```\\n\\nWe need to return the matrix, but we\'re given a reference to the input matrix. We\\ncan easily abuse the C++ here and just switch the reference to value, this way\\nthe matrix will be copied when passed to the function, we can sort the copy and\\njust return it back. And we also get yelled by the compiler for the fact that the\\nmethod doesn\'t return anything yet, so to make it \u201cshut up\u201d we will just return\\nthe input for now:\\n\\n```diff\\n- matrix diagonalSort(matrix& mat)\\n+ matrix diagonalSort(matrix mat)\\n {\\n+ return mat;\\n }\\n```\\n\\nNow, we get the copy and we\'re good to go.\\n\\n## Na\xefve solution\\n\\nAs you may know, C++ offers a plethora of functions that can be used to your\\nadvantage, given that you know how to \u201cbend\u201d the data structures accordingly.\\n\\nWhat does that mean for us? Well, we have an `std::sort`, we can use it, right?\\nLet\'s have a look at it:\\n\\n```cpp\\ntemplate< class RandomIt >\\nvoid sort( RandomIt first, RandomIt last );\\n```\\n\\nThis overload is more than we need. What does it do? It just sorts the elements\\nin the range `[first, last)` using `operator<` on them. We can\'t sort the whole\\nmatrix using this, but\u2026 we can sort just \xbbone\xab diagonal without doing much work\\non our end.\\n\\nWhat is the `RandomIt` type though? If we look more into the documentation, we\\ncan easily find the requirements for it and also learn that it\'s a _random access_\\n_iterator_ and allows swapping its values at the same time.\\n\\n:::tip Random access iterator\\n\\nWhat is the _random access iterator_ though? We can find it in a documentation\\nand see the following description:\\n\\n> A **LegacyRandomAccessIterator** is a [LegacyBidirectionalIterator](https://en.cppreference.com/w/cpp/named_req/BidirectionalIterator)\\n> that can be moved to point to any element in constant time.\\n\\nAfter that we can see all the requirements for it being listed. 
I don\'t feel like\\nreading them right now, so we will just use it and see where the compilation blows\\nup, i.e. \u201c_compiler-assisted development_\u201d[^2] if you will ;)\\n\\n:::\\n\\nNow we know that we can use `std::sort` to sort the diagonal itself, but we also\\nneed to get the diagonals somehow. I\'m rather lazy, so I\'ll just delegate it to\\nsomeone else[^3]. And that way we get\\n\\n```cpp\\nmatrix diagonalSort(matrix mat)\\n{\\n // we iterate over the diagonals\\n for (auto d : diagonals(mat)) {\\n // and we sort each diagonal\\n std::sort(d.begin(), d.end());\\n }\\n\\n // we take the matrix by copy, so we can sort in-situ and return the copy\\n // that we sorted\\n return mat;\\n}\\n```\\n\\nThis solution looks very simple, doesn\'t it? Well, cause it is.\\nLet\'s try compiling it:\\n\\n```\\nmatrix-sort.cpp:11:23: error: use of undeclared identifier \'diagonals\' [clang-diagnostic-error]\\n for (auto d : diagonals(mat)) {\\n ^\\nFound compiler error(s).\\nmake: *** [makefile:14: tidy] Error 1\\n```\\n\\nOK, seems about right. We haven\'t implemented the `diagonals` yet. And based on\\nwhat we\'ve written so far, we need a function or a class `diagonals` that will\\ngive us the diagonals we need.\\n\\n## Implementing the `diagonals`\\n\\nCool, so we need the function that will let us go through each and every diagonal\\nin our matrix. We use the _for-range_ loop, so whatever we get back from the\\n`diagonals` must support `.begin()` and `.end()`. Since I am a masochist, we will\\ndo such functionality for a matrix of any type, not just the `int` from the challenge.\\n\\nAs I said, we need to be able to\\n\\n- construct the object\\n- get the beginning\\n- get the end (the \u201csentinel\u201d)\\n\\n```cpp\\ntemplate \\nclass diagonals {\\n using matrix_t = std::vector>;\\n\\n matrix_t& _matrix;\\n\\npublic:\\n diagonals(matrix_t& m)\\n : _matrix(m)\\n {\\n }\\n diagonals_iter begin()\\n {\\n /* TODO */\\n }\\n diagonals_iter end()\\n {\\n /* TODO */\\n }\\n};\\n```\\n\\nNow we have a `diagonals` that we can use to go through the diagonals. We haven\'t\\nimplemented the core of it yet. Let\'s go through what we have for now.\\n\\nWe have a templated class with templated `T` that is used as a placeholder for any\\ntype we would store in the matrix. Because I\'m lazy, I have defined the `matrix_t`\\ntype that is a \u201cshortcut\u201d for `std::vector>`, so I don\'t have to\\ntype it out all the time. Of course, we need to store the matrix, we are given,\\nas a private attribute. And then just have the constructor and the 2 methods we\\nneed for the _for-range_.\\n\\n### Iterating over diagonals\\n\\nNow that we have an object that will allow us to iterate through the diagonals,\\nwe need to implement the iterating itself. We need to go through all of them, so\\nwe have multiple options how to do so. I have decided to start from the \u201cmain\u201d\\ndiagonal that starts at `(0, 0)` index and then proceed with the diagonals starting\\nin the first row, followed by the rest of the diagonals in the first column.\\n\\nWe need to be able to tell that we\'ve iterated through all of them, and also we\\nneed to know which diagonal is next. For that purpose we will pass the indices\\nof the first cell on the diagonal. 
That way we can always tell how to move forward.\\n\\nWe will start by updating the `begin` and `end` to reflect our choice accordingly.\\n\\n```cpp\\ndiagonals_iter begin() { return diagonals_iter { _matrix, 0, 0 }; }\\ndiagonals_iter end() { return diagonals_iter { _matrix, 0, _matrix.size() }; }\\n```\\n\\nFor the `begin` we return the first diagonal that starts at `(0, 0)`. And because\\nwe have decided to do the diagonals in the first column at the end, the first\\ndiagonal that is not a valid one is the one at `(0, height)`. Apart from the\\nindices, we also need to pass reference to the matrix itself.\\n\\n:::note\\n\\nYou may have noticed that we also include the diagonals that have length 1,\\nspecifically the ones at `(0, height - 1)` and `(width - 1, 0)`. We are implementing\\nan iterator that **should not** care about the way it\'s being used. Therefore, we\\ndon\'t care about the fact they don\'t need to be sorted.\\n\\n:::\\n\\nCool, let\'s leave the iterator itself to someone else, right?[^4]\\n\\n### Implementing the iterator over diagonals\\n\\nWe can start with a simple skeleton based on the information that we pass from\\nthe `diagonals`. Also to utilize the `matrix_t` and also contain implementation\\ndetails hidden away, we will put this code into the `diagonals` class.\\n\\n```cpp\\nclass diagonals_iter {\\n matrix_t& m;\\n std::size_t x;\\n std::size_t y;\\n\\npublic:\\n diagonals_iter(matrix_t& matrix, std::size_t x, std::size_t y)\\n : m(matrix)\\n , x(x)\\n , y(y)\\n {\\n }\\n};\\n```\\n\\nIn this case we will be implementing a \u201csimple\u201d forward iterator, so we don\'t\\nneed to implement a lot. Notably it will be:\\n\\n- inequality operator (we need to know when we reach the end and have nothing to\\n iterate over)\\n- preincrementation operator (we need to be able to move around the iterable)\\n- dereference operator (we need to be able to retrieve the objects we iterate\\n over)\\n\\n```cpp\\nclass diagonals_iter {\\n matrix_t& m;\\n std::size_t x;\\n std::size_t y;\\n\\npublic:\\n diagonals_iter(matrix_t& matrix, std::size_t x, std::size_t y)\\n : m(matrix)\\n , x(x)\\n , y(y)\\n {\\n }\\n\\n bool operator!=(const diagonals_iter& rhs) const\\n {\\n // iterators are not equal if they reference different matrices, or\\n // their positions differ\\n return m != rhs.m || x != rhs.x || y != rhs.y;\\n }\\n\\n diagonals_iter& operator++()\\n {\\n if (y != 0) {\\n // iterating through diagonals down the first column\\n y++;\\n return *this;\\n }\\n\\n // iterating the diagonals along the first row\\n x++;\\n if (x == m.front().size()) {\\n // switching to diagonals in the first column\\n x = 0;\\n y++;\\n }\\n\\n return *this;\\n }\\n\\n diagonal operator*() const { return diagonal { m, x, y }; }\\n};\\n```\\n\\nLet\'s go one-by-one. Inequality operator is rather simple, just compare iterator\'s\\nattributes field-by-field. If you think about it, checking inequality of two 2D\\nvectors may be a bit inefficient, therefore, we can swap around and check it as\\na last thing.\\n\\n```diff\\n- return m != rhs.m || x != rhs.x || y != rhs.y;\\n+ return x != rhs.x || y != rhs.y || m != rhs.m;\\n```\\n\\nPreincrementation is where the magic happens. If you have a better look, you can\\nsee two branches of this operation:\\n\\n1. When `y != 0` (we\'re iterating over the diagonals in the first column)\\n In this case, we just bump the row and we\'re done.\\n2. 
When `y == 0` (we\'re iterating over the diagonals in the first row)\\n In this case, we bump the column and check if we haven\'t gotten out of bounds,\\n i.e. the end of the first row. If we get out of the bounds, we\'re continuing\\n with the second diagonal in the first column.\\n\\nDereferencing the iterator must \u201cyield\u201d something. In our case it will be the\\ndiagonal that we want to sort. For sorting we need just the iterators that can\\nmove around said diagonal. The simplest thing, we can do, is to delegate it to\\nsomething else. In our case it will be a class called `diagonal`.\\n\\n## Implementing the `diagonal` itself\\n\\nAfter implementing the iterator over diagonals, we know that all we need to describe\\na diagonal is the matrix itself and the \u201cstart\u201d of the diagonal (row and column).\\nAnd we also know that the diagonal must provide some iterators for the `std::sort`\\nfunction. We can start with the following skeleton:\\n\\n```cpp\\ntemplate \\nclass diagonal {\\n using matrix_t = std::vector>;\\n\\n matrix_t& matrix;\\n std::size_t x;\\n std::size_t y;\\n\\npublic:\\n diagonal(matrix_t& matrix, std::size_t x, std::size_t y)\\n : matrix(matrix)\\n , x(x)\\n , y(y)\\n {\\n }\\n\\n diagonal_iter begin() const { return diagonal_iter { matrix, x, y }; }\\n\\n diagonal_iter end() const\\n {\\n auto max_x = matrix[y].size();\\n auto max_y = matrix.size();\\n\\n // we need to find the distance in which we get out of bounds (either in\\n // column or row)\\n auto steps = std::min(max_x - x, max_y - y);\\n\\n return diagonal_iter { matrix, x + steps, y + steps };\\n }\\n};\\n```\\n\\nInitialization is rather simple, we just \u201ckeep\u201d the stuff we get, `begin` is the\\nsimplest, we just delegate.\\n\\nIn case of the `end`, it gets more complicated. We need to know where is the \u201cend\u201d\\nof the diagonal. Since `end` should point to the first element \u201cafter\u201d the iterable,\\nwe know that it\'s the first position of the iterator where either `y` becomes\\n`matrix.size()` or `x` becomes `matrix[y].size()`. Also we are moving along diagonal,\\nduh, therefore we can deduce the first \u201cposition\u201d afterwards by minimal amount of\\nsteps to get out of the any column or row, hence `std::min(max_x - x, max_y - y)`.\\nFinal position is then computed simply by adding the steps to the beginning of\\nthe diagonal.\\n\\nNow we just need to finish the iterator for the diagonal itself and we\'re done.\\n\\n### Implementing `diagonal_iter`\\n\\nThis part is the hardest from all we need to do. It\'s because of the requirements\\nof the `std::sort` that requires us to implement a _random access iterator_. I have\\nbriefly described it above, and \u201cin a nutshell\u201d it means that we need to implement\\nan iterator that can move in constant time along the diagonal in any amount of\\nsteps.\\n\\nLet\'s go through all of the functionality that our iterator needs to support to\\nbe used in `std::sort`. We need the usual operations like:\\n\\n- equality/inequality\\n- incrementation\\n- dereferencing\\n\\nWe will also add all the types that our iterator uses with the category of the\\niterator, i.e. 
what interface it supports:\\n\\n```cpp\\nclass diagonal_iter {\\n // we need to keep reference to the matrix itself\\n matrix_t& m;\\n\\n // we need to be able to tell our current position\\n std::size_t x;\\n std::size_t y;\\n\\npublic:\\n using difference_type = std::ptrdiff_t;\\n using value_type = T;\\n using pointer = T*;\\n using reference = T&;\\n using iterator_category = std::random_access_iterator_tag;\\n\\n diagonal_iter(matrix_t& matrix,\\n std::size_t x,\\n std::size_t y)\\n : m(matrix)\\n , x(x)\\n , y(y)\\n {\\n }\\n\\n bool operator==(const diagonal_iter& rhs) const\\n {\\n return x == rhs.x && y == rhs.y && m == rhs.m;\\n }\\n\\n diagonal_iter& operator++()\\n {\\n // we are moving along the diagonal, so we increment both \u2039x\u203a and \u2039y\u203a at\\n // the same time\\n x++;\\n y++;\\n return *this;\\n }\\n\\n reference operator*() const { return m[y][x]; }\\n};\\n```\\n\\nThis is pretty similar to the previous iterator, but now we need to implement the\\nremaining requirements of the _random access iterator_. Let\'s see what those are:\\n\\n- decrementation - cause we need to be able to move backwards too, since _random _\\n _access iterator_ extends the interface of _bidirectional iterator_\\n- moving the iterator in either direction by steps given as an integer\\n- being able to tell the distance between two iterators\\n- define an ordering on the iterators\\n\\nLet\'s fill them in:\\n\\n```cpp\\nclass diagonal_iter {\\n // we need to keep reference to the matrix itself\\n matrix_t& m;\\n\\n // we need to be able to tell our current position\\n std::size_t x;\\n std::size_t y;\\n\\npublic:\\n using difference_type = std::ptrdiff_t;\\n using value_type = T;\\n using pointer = T*;\\n using reference = T&;\\n using iterator_category = std::random_access_iterator_tag;\\n\\n diagonal_iter(matrix_t& matrix,\\n std::size_t x,\\n std::size_t y)\\n : m(matrix)\\n , x(x)\\n , y(y)\\n {\\n }\\n\\n bool operator==(const diagonal_iter& rhs) const\\n {\\n return x == rhs.x && y == rhs.y && m == rhs.m;\\n }\\n\\n diagonal_iter& operator++()\\n {\\n // we are moving along the diagonal, so we increment both \u2039x\u203a and \u2039y\u203a at\\n // the same time\\n x++;\\n y++;\\n return *this;\\n }\\n\\n reference operator*() const { return m[y][x]; }\\n\\n // exactly opposite to the incrementation\\n diagonal_iter operator--()\\n {\\n x--;\\n y--;\\n return *this;\\n }\\n\\n // moving \u2039n\u203a steps back is same as calling decrementation \u2039n\u203a-times, so we\\n // can just return a new iterator and subtract \u2039n\u203a from both coordinates in\\n // the matrix\\n diagonal_iter operator-(difference_type n) const\\n {\\n return diagonal_iter { m, x - n, y - n };\\n }\\n\\n // here we assume that we are given two iterators on the same diagonal\\n difference_type operator-(const diagonal_iter& rhs) const\\n {\\n assert(m == rhs.m);\\n return x - rhs.x;\\n }\\n\\n // counterpart of moving \u2039n\u203a steps backwards\\n diagonal_iter operator+(difference_type n) const\\n {\\n return diagonal_iter { m, x + n, y + n };\\n }\\n\\n // we compare the coordinates, and also assume that those 2 iterators are\\n // lying on the same diagonal\\n bool operator<(const diagonal_iter& rhs) const\\n {\\n assert(m == rhs.m);\\n return x < rhs.x && y < rhs.y;\\n }\\n};\\n```\\n\\nAt this point we could probably try and compile it, right? 
If we do so, we will\\nget yelled at by a compiler for the following reasons:\\n\\n```\\n/usr/bin/../lib/gcc/x86_64-redhat-linux/12/../../../../include/c++/12/bits/stl_algo.h:1792:11: error: object of type \'diagonal::diagonal_iter\' cannot be assigned because its copy assignment operator is implicitly deleted [clang-diagnostic-error]\\n __last = __next;\\n ^\\n/usr/bin/../lib/gcc/x86_64-redhat-linux/12/../../../../include/c++/12/bits/stl_algo.h:1817:11: note: in instantiation of function template specialization \'std::__unguarded_linear_insert::diagonal_iter, __gnu_cxx::__ops::_Val_less_iter>\' requested here\\n std::__unguarded_linear_insert(__i,\\n ^\\n/usr/bin/../lib/gcc/x86_64-redhat-linux/12/../../../../include/c++/12/bits/stl_algo.h:1849:9: note: in instantiation of function template specialization \'std::__insertion_sort::diagonal_iter, __gnu_cxx::__ops::_Iter_less_iter>\' requested here\\n std::__insertion_sort(__first, __first + int(_S_threshold), __comp);\\n ^\\n/usr/bin/../lib/gcc/x86_64-redhat-linux/12/../../../../include/c++/12/bits/stl_algo.h:1940:9: note: in instantiation of function template specialization \'std::__final_insertion_sort::diagonal_iter, __gnu_cxx::__ops::_Iter_less_iter>\' requested here\\n std::__final_insertion_sort(__first, __last, __comp);\\n ^\\n/usr/bin/../lib/gcc/x86_64-redhat-linux/12/../../../../include/c++/12/bits/stl_algo.h:4820:12: note: in instantiation of function template specialization \'std::__sort::diagonal_iter, __gnu_cxx::__ops::_Iter_less_iter>\' requested here\\n std::__sort(__first, __last, __gnu_cxx::__ops::__iter_less_iter());\\n ^\\nmatrix-sort.cpp:161:18: note: in instantiation of function template specialization \'std::sort::diagonal_iter>\' requested here\\n std::sort(d.begin(), d.end());\\n ^\\nmatrix-sort.cpp:17:19: note: copy assignment operator of \'diagonal_iter\' is implicitly deleted because field \'m\' is of reference type \'diagonal::matrix_t &\' (aka \'vector> &\')\\n matrix_t& m;\\n ^\\n/usr/bin/../lib/gcc/x86_64-redhat-linux/12/../../../../include/c++/12/bits/stl_algo.h:1830:2: error: no matching function for call to \'__unguarded_linear_insert\' [clang-diagnostic-error]\\n std::__unguarded_linear_insert(__i,\\n ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~\\n/usr/bin/../lib/gcc/x86_64-redhat-linux/12/../../../../include/c++/12/bits/stl_algo.h:1850:9: note: in instantiation of function template specialization \'std::__unguarded_insertion_sort::diagonal_iter, __gnu_cxx::__ops::_Iter_less_iter>\' requested here\\n std::__unguarded_insertion_sort(__first + int(_S_threshold), __last,\\n ^\\n/usr/bin/../lib/gcc/x86_64-redhat-linux/12/../../../../include/c++/12/bits/stl_algo.h:1940:9: note: in instantiation of function template specialization \'std::__final_insertion_sort::diagonal_iter, __gnu_cxx::__ops::_Iter_less_iter>\' requested here\\n std::__final_insertion_sort(__first, __last, __comp);\\n ^\\n/usr/bin/../lib/gcc/x86_64-redhat-linux/12/../../../../include/c++/12/bits/stl_algo.h:4820:12: note: in instantiation of function template specialization \'std::__sort::diagonal_iter, __gnu_cxx::__ops::_Iter_less_iter>\' requested here\\n std::__sort(__first, __last, __gnu_cxx::__ops::__iter_less_iter());\\n ^\\nmatrix-sort.cpp:161:18: note: in instantiation of function template specialization \'std::sort::diagonal_iter>\' requested here\\n std::sort(d.begin(), d.end());\\n ^\\n/usr/bin/../lib/gcc/x86_64-redhat-linux/12/../../../../include/c++/12/bits/stl_algo.h:1782:5: note: candidate template ignored: substitution failure [with 
_RandomAccessIterator = diagonal::diagonal_iter, _Compare = __gnu_cxx::__ops::_Val_less_iter]\\n __unguarded_linear_insert(_RandomAccessIterator __last,\\n ^\\n/usr/bin/../lib/gcc/x86_64-redhat-linux/12/../../../../include/c++/12/bits/stl_algo.h:1923:11: error: object of type \'diagonal::diagonal_iter\' cannot be assigned because its copy assignment operator is implicitly deleted [clang-diagnostic-error]\\n __last = __cut;\\n ^\\n/usr/bin/../lib/gcc/x86_64-redhat-linux/12/../../../../include/c++/12/bits/stl_algo.h:1937:9: note: in instantiation of function template specialization \'std::__introsort_loop::diagonal_iter, long, __gnu_cxx::__ops::_Iter_less_iter>\' requested here\\n std::__introsort_loop(__first, __last,\\n ^\\n/usr/bin/../lib/gcc/x86_64-redhat-linux/12/../../../../include/c++/12/bits/stl_algo.h:4820:12: note: in instantiation of function template specialization \'std::__sort::diagonal_iter, __gnu_cxx::__ops::_Iter_less_iter>\' requested here\\n std::__sort(__first, __last, __gnu_cxx::__ops::__iter_less_iter());\\n ^\\nmatrix-sort.cpp:161:18: note: in instantiation of function template specialization \'std::sort::diagonal_iter>\' requested here\\n std::sort(d.begin(), d.end());\\n ^\\nmatrix-sort.cpp:17:19: note: copy assignment operator of \'diagonal_iter\' is implicitly deleted because field \'m\' is of reference type \'diagonal::matrix_t &\' (aka \'vector> &\')\\n matrix_t& m;\\n ^\\n```\\n\\nThat\'s a lot of noise, isn\'t it? Let\'s focus on the important parts:\\n\\n```\\n/usr/bin/../lib/gcc/x86_64-redhat-linux/12/../../../../include/c++/12/bits/stl_algo.h:1792:11: error: object of type \'diagonal::diagonal_iter\' cannot be assigned because its copy assignment operator is implicitly deleted [clang-diagnostic-error]\\n\u2026\\nmatrix-sort.cpp:17:19: note: copy assignment operator of \'diagonal_iter\' is implicitly deleted because field \'m\' is of reference type \'diagonal::matrix_t &\' (aka \'vector> &\')\\n matrix_t& m;\\n ^\\n```\\n\\nAh! We have a reference in our iterator, and this prevents us from having a copy\\nassignment operator (that is used \u201csomewhere\u201d in the sorting algorithm). Well\u2026\\nLet\'s just wrap it!\\n\\n```diff\\n# we need to keep a different type than reference\\n- matrix_t& m;\\n+ std::reference_wrapper m;\\n\\n# in comparison we need to get the reference out of the wrapper first\\n- return x == rhs.x && y == rhs.y && m == rhs.m;\\n+ return x == rhs.x && y == rhs.y && m.get() == rhs.m.get();\\n\\n# same when we return a reference to the \u201ccell\u201d in the matrix\\n- reference operator*() const { return m[y][x]; }\\n+ reference operator*() const { return m.get()[y][x]; }\\n\\n# and finally in the assertions that we set for the \u201cdistance\u201d and \u201cless than\u201d\\n- assert(m == rhs.m);\\n+ assert(m.get() == rhs.m.get());\\n```\\n\\nWe\'re done now! We have written an iterator over diagonals for a 2D `vector`. 
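If you want to see the difference in isolation, here is a stripped-down sketch (a made-up `iter_like` struct, not the actual iterator) showing that a `std::reference_wrapper` member keeps the implicitly generated copy assignment that `std::sort` relies on, whereas a plain reference member would delete it:

```cpp
#include <cassert>
#include <functional>
#include <vector>

// With a plain `std::vector<int>&` member the copy assignment operator would be
// implicitly deleted; with `std::reference_wrapper` it stays available, because
// the wrapper itself can be rebound on assignment.
struct iter_like {
    std::reference_wrapper<std::vector<int>> m;
    std::size_t x;
};

int main()
{
    std::vector<int> a { 1, 2, 3 };
    std::vector<int> b { 4, 5, 6 };

    iter_like first { a, 0 };
    iter_like second { b, 1 };

    first = second; // this is what `std::sort` needs to be able to do
    assert(&first.m.get() == &b);
    assert(first.x == 1);

    return 0;
}
```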
You can have a look at the final result [here](pathname:///files/blog/leetcode/sort-matrix-diagonally/matrix-sort.cpp).\\n\\n[^1]: just because I\'m used to it and don\'t care about your opinion ;)\\n[^2]: exercise at your own risk\\n[^3]: me in 5 minutes in fact, but don\'t make me scared\\n[^4]: me in the next section\u2026"},{"id":"aoc-2022/2nd-week","metadata":{"permalink":"/blog/aoc-2022/2nd-week","editUrl":"https://github.com/mfocko/blog/tree/main/blog/aoc-2022/02-week-2.md","source":"@site/blog/aoc-2022/02-week-2.md","title":"2nd week of Advent of Code \'22 in Rust","description":"Surviving second week in Rust.","date":"2022-12-25T23:15:00.000Z","formattedDate":"December 25, 2022","tags":[{"label":"advent-of-code","permalink":"/blog/tags/advent-of-code"},{"label":"advent-of-code-2022","permalink":"/blog/tags/advent-of-code-2022"},{"label":"rust","permalink":"/blog/tags/rust"}],"readingTime":20.875,"hasTruncateMarker":true,"authors":[{"name":"Matej Focko","email":"me+blog@mfocko.xyz","title":"a.k.a. @mf","url":"https://gitlab.com/mfocko","imageURL":"https://github.com/mfocko.png","key":"mf"}],"frontMatter":{"title":"2nd week of Advent of Code \'22 in Rust","description":"Surviving second week in Rust.","date":"2022-12-25T23:15","slug":"aoc-2022/2nd-week","authors":"mf","tags":["advent-of-code","advent-of-code-2022","rust"],"hide_table_of_contents":false},"unlisted":false,"prevItem":{"title":"Sort the matrix diagonally","permalink":"/blog/leetcode/sort-diagonally"},"nextItem":{"title":"1st week of Advent of Code \'22 in Rust","permalink":"/blog/aoc-2022/1st-week"}},"content":"Let\'s go through the second week of [_Advent of Code_] in Rust.\\n\\n\x3c!--truncate--\x3e\\n\\n## [Day 8: Treetop Tree House](https://adventofcode.com/2022/day/8)\\n\\n:::info tl;dr\\n\\nWe get a forest and we want to know how many trees are visible from the outside.\\nApart from that we want to find the best view.\\n\\n:::\\n\\nNothing interesting. We are moving around 2D map though. And indexing can get a\\nbit painful when doing so, let\'s refactor it a bit ;) During the preparation for\\nthe AoC, I have written `Vector2D` and now it\'s time to extend it with indexing\\nof `Vec` of `Vec`s. In my solution I was manipulating with indices in the following\\nway:\\n\\n- swapping them\\n- checking whether they are correct indices for the `Vec>`\\n- indexing `Vec>` with them\\n\\n:::caution\\n\\nI\'m getting familiar with Rust and starting to \u201cabuse\u201d it\u2026 While doing so, I\'m\\nalso uncovering some \u201cfeatures\u201d that I don\'t really like. Therefore I will mark\\nall of my rants with _thicc_ **\xab\u21af\xbb** mark and will try to \u201clock\u201d them into their\\nown \u201cbox of hell\u201d.\\n\\n:::\\n\\n#### Swapping indices\\n\\nRelatively simple implementation, just take the values, swap them and return new\\nvector.\\n\\n```rust\\nimpl Vector2D {\\n pub fn swap(&self) -> Self {\\n Self {\\n x: self.y,\\n y: self.x,\\n }\\n }\\n}\\n```\\n\\nPretty straight-forward implementation, but let\'s talk about the `T: Copy`. We\\nneed to use it, since we are returning a **new** vector, with swapped **values**.\\nIf we had values that cannot be copied, the only thing we could do, would be a\\nvector of references (and it would also introduce a lifetime, to which we\'ll get\\nlater on). 
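As a quick sanity check, a self-contained sketch of the above (assuming a plain generic `Vector2D<T>` with public `x` and `y`, which is roughly what the helper type looks like):

```rust
#[derive(Clone, Copy, Debug, PartialEq)]
struct Vector2D<T> {
    x: T,
    y: T,
}

impl<T: Copy> Vector2D<T> {
    // Returns a new vector with the coordinates swapped; `T: Copy` lets us
    // copy the values instead of handing out references.
    pub fn swap(&self) -> Self {
        Self {
            x: self.y,
            y: self.x,
        }
    }
}

fn main() {
    let v = Vector2D { x: 1, y: 2 };
    assert_eq!(v.swap(), Vector2D { x: 2, y: 1 });
}
```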
This is pretty similar with the operations on sets from the first week.\\n\\n#### Indexing `Vec`\\n\\nI will start with the indexing, cause bound-checking is a bit more\u2026 complicated\\nthan I would like to.\\n\\n```rust\\npub fn index<\'a, T, U>(v: &\'a [Vec], idx: &Vector2D) -> &\'a U\\nwhere\\n usize: TryFrom,\\n >::Error: Debug,\\n T: Copy,\\n{\\n let (x, y): (usize, usize) = (idx.x.try_into().unwrap(), idx.y.try_into().unwrap());\\n &v[y][x]\\n}\\n```\\n\\nLet\'s talk about this mess\u2026 Body of the function is probably the most easy part\\nand should not be hard to understand, we just take the `x` and `y` and convert\\nthem both to `usize` type that can be used later on for indexing.\\n\\nThe type signature of the function is where the fun is at :wink: We are trying\\nto convert unknown type to `usize`, so we must bound the `T` as a type that can\\nbe converted to `usize`, that\'s how we got `usize: TryFrom` which basically\\nsays that `usize` must implement `TryFrom` trait and therefore allows us to\\nconvert the indices to actual `usize` indices. Using `.unwrap()` also forces us\\nto bound the error that can occur when converting `T` into `usize`, that\'s how\\nwe get `>::Error: Debug` which loosely means\\n\\n> error during conversion of `T` into `usize` must implement `Debug`,\\n> i.e. can be printed in some way or other\\n\\n`T: Copy` is required by `.try_into()` which takes `T` by-value.\\n\\nAnd now we are left only with the first line of the definition.\\n\\n:::note\\n\\nSkilled Rustaceans might notice that this implementation is rather flaky and can\\nbreak in multiple places at once. I\'ll get back to it\u2026\\n\\n:::\\n\\nLet\'s split it in multiple parts:\\n\\n- `v: &\'a [Vec]` represents the 2D `Vec`, we are indexing, `Vec` implements\\n `Slice` trait and _clippy_ recommends using `&[T]` to `&Vec`, exact details\\n are unknown to me\\n- `idx: &Vector2D` represents the _indices_ which we use, we take them by\\n reference to avoid an unnecessary copy\\n- `-> &\'a U` means that we are returning a _reference_ to some value of type `U`.\\n Now the question is what does the `\'a` mean, we can also see it as a generic\\n type declared along `T` and `U`. And the answer is _relatively_ simple, `\'a`\\n represents a _lifetime_. We take the `v` by a reference and return a reference,\\n borrow checker validates all of the _borrows_ (or references), so we need to\\n specify that our returned value has _the same lifetime_ as the vector we have\\n taken by a reference, i.e. returned reference must live at least as long as the\\n `v`. This way we can \u201cbe sure\u201d that the returned reference is valid.\\n\\n##### Issues\\n\\nFirst issue that our implementation has is the fact that we cannot get a mutable\\nreference out of that function. This could be easily resolved by introducing new\\nfunction, e.g. `index_mut`. Which I have actually done while writing this part:\\n\\n```rust\\npub fn index_mut<\'a, T, U>(v: &\'a mut [Vec], idx: &Vector2D) -> &\'a mut U\\nwhere\\n usize: TryFrom,\\n >::Error: Debug,\\n T: Copy,\\n{\\n let (x, y): (usize, usize) = (idx.x.try_into().unwrap(), idx.y.try_into().unwrap());\\n &mut v[y][x]\\n}\\n```\\n\\n:::caution **\xab\u21af\xbb** Why can\'t we use one function?\\n\\nWhen we consider a `Vec`, we don\'t need to consider containers as `T`, Rust\\nimplements indexing as traits `Index` and `IndexMut` that do the dirty work\\nbehind syntactic sugar of `container[idx]`.\\n\\nHowever, implementing of traits is not allowed for _external_ types, i.e. 
types\\nthat you haven\'t defined yourself. This means that you can implement indexing\\nover containers that you have implemented yourself, but you cannot use your own\\ntypes for indexing \u201cbuilt-in\u201d types.\\n\\nAnother part of this rabbit hole is trait `SliceIndex` that is of a relevance\\nbecause of\\n\\n```rust\\nimpl Index for [T]\\nwhere\\n I: SliceIndex<[T]>\\n\\nimpl Index for Vec\\nwhere\\n I: SliceIndex<[T]>,\\n A: Allocator\\n\\nimpl Index for [T; N]\\nwhere\\n [T]: Index\\n```\\n\\nIn other words, if your type implements `SliceIndex` trait, it can be used\\nfor indexing. As of now, this trait has all of its required methods experimental\\nand is marked as `unsafe`.\\n\\n:::\\n\\nAnother problem is a requirement for indexing either `[Vec]` or `Vec>`.\\nThis requirement could be countered by removing inner type `Vec` and constraining\\nit by a trait `Index` (or `IndexMut` respectively) in a following way\\n\\n```rust\\npub fn index<\'a, C, T>(v: &\'a [C], idx: &Vector2D) -> &\'a C::Output\\nwhere\\n usize: TryFrom,\\n >::Error: Debug,\\n T: Copy,\\n C: Index\\n{\\n let (x, y): (usize, usize) = (idx.x.try_into().unwrap(), idx.y.try_into().unwrap());\\n &v[y][x]\\n}\\n```\\n\\nGiven this, we can also give a more meaningful typename for indexing type, such\\nas `I`.\\n\\n#### Checking bounds\\n\\nNow we can get to the boundary checks, it is very similar, but a more\u2026 dirty.\\nFirst approach that came up was to convert the indices in `Vector2D` to `usize`,\\nbut when you add the indices up, e.g. when checking the neighbors, you can end\\nup with negative values which, unlike in C++, causes an error (instead of underflow\\nthat you can use to your advantage; you can easily guess how).\\n\\nSo how can we approach this then? Well\u2026 we will convert the bounds instead of\\nthe indices and that lead us to:\\n\\n```rust\\npub fn in_range(v: &[Vec], idx: &Vector2D) -> bool\\nwhere\\n usize: TryInto,\\n >::Error: Debug,\\n T: PartialOrd + Copy,\\n{\\n idx.y >= 0.try_into().unwrap()\\n && idx.y < v.len().try_into().unwrap()\\n && idx.x >= 0.try_into().unwrap()\\n && idx.x\\n < v[TryInto::::try_into(idx.y).unwrap()]\\n .len()\\n .try_into()\\n .unwrap()\\n}\\n```\\n\\nYou can tell that it\'s definitely a shitty code. Let\'s improve it now! We will\\nget back to the original idea, but do it better. We know that we cannot convert\\nnegative values into `usize`, **but** we also know that conversion like that\\nreturns a `Result` which we can use to our advantage.\\n\\n```rust\\npub fn in_range(v: &[Vec], idx: &Vector2D) -> bool\\nwhere\\n T: Copy,\\n usize: TryFrom,\\n{\\n usize::try_from(idx.y)\\n .and_then(|y| usize::try_from(idx.x).map(|x| y < v.len() && x < v[y].len()))\\n .unwrap_or(false)\\n}\\n```\\n\\n`Result` is a type similar to `Either` in Haskell and it allows us to chain\\nmultiple operations on correct results or propagate the original error without\\ndoing anything. Let\'s dissect it one-by-one.\\n\\n`try_from` is a method implemented in `TryFrom` trait, that allows you to convert\\ntypes and either successfully convert them or fail (with a reasonable error). This\\nmethod returns `Result`.\\n\\nWe call `and_then` on that _result_, let\'s have a look at the type signature of\\n`and_then`, IMO it explains more than enough:\\n\\n```rust\\npub fn and_then(self, op: F) -> Result\\nwhere\\n F: FnOnce(T) -> Result\\n```\\n\\nOK\u2026 So it takes the result and a function and returns another result with\\ndifferent value and different error. 
However we can see that the function, which\\nrepresents an operation on a result, takes just the value, i.e. it doesn\'t care\\nabout any previous error. To make it short:\\n\\n> `and_then` allows us to run an operation, which can fail, on the correct result\\n\\nWe parsed a `y` index and now we try to convert the `x` index with `try_from`\\nagain, but on that result we use `map` rather than `and_then`, why would that be?\\n\\n```rust\\npub fn map(self, op: F) -> Result\\nwhere\\n F: FnOnce(T) -> U\\n```\\n\\nHuh\u2026 `map` performs an operation that **cannot** fail. And finally we use\\n`unwrap_or` which takes the value from result, or in case of an error returns the\\ndefault that we define.\\n\\nHow does this work then? If `y` is negative, the conversion fails and the error\\npropagates all the way to `unwrap_or`, if `y` can be a correct `usize` value, then\\nwe do the same with `x`. If `x` is negative, we propagate the error as with `y`,\\nand if it\'s not, then we check whether it exceeds the higher bounds or not.\\n\\n### Solution\\n\\nRelatively simple, you just need follow the rules and not get too smart, otherwise\\nit will get back at you.\\n\\n## [Day 9: Rope Bridge](https://adventofcode.com/2022/day/9)\\n\\n:::info tl;dr\\n\\nWe get a rope with knots and we want to track how many different positions are\\nvisited with the rope\'s tail.\\n\\n:::\\n\\nBy this day, I have come to a conclusion that current skeleton for each day\\ngenerates a lot of boilerplate. And even though it can be easily copied, it\'s\\njust a waste of space and unnecessary code. Let\'s \u201csimplify\u201d this (on one end\\nwhile creating monster on the other end). I\'ve gone through what we need in the\\npreparations for the AoC. Let\'s sum up our requirements:\\n\\n- parsing\\n- part 1 & 2\\n- running on sample / input\\n- tests\\n\\nParsing and implementation of both parts is code that changes each day and we\\ncannot do anything about it. However running and testing can be simplified!\\n\\nLet\'s introduce and export a new module `solution` that will take care of all of\\nthis. We will start by introducing a trait for each day.\\n\\n```rust\\npub trait Solution {\\n fn parse_input>(pathname: P) -> Input;\\n\\n fn part_1(input: &Input) -> Output;\\n fn part_2(input: &Input) -> Output;\\n}\\n```\\n\\nThis does a lot of work for us already, we have defined a trait and for each day\\nwe will create a structure representing a specific day. That structure will also\\nimplement the `Solution` trait.\\n\\nNow we need to get rid of the boilerplate, we can\'t get rid of the `main` function,\\nbut we can at least move out the functionality.\\n\\n```rust\\nfn run(type_of_input: &str) -> Result<()>\\nwhere\\n Self: Sized,\\n{\\n tracing_subscriber::fmt()\\n .with_env_filter(EnvFilter::from_default_env())\\n .with_target(false)\\n .with_file(true)\\n .with_line_number(true)\\n .without_time()\\n .compact()\\n .init();\\n color_eyre::install()?;\\n\\n let input = Self::parse_input(format!(\\"{}s/{}.txt\\", type_of_input, Self::day()));\\n\\n info!(\\"Part 1: {}\\", Self::part_1(&input));\\n info!(\\"Part 2: {}\\", Self::part_2(&input));\\n\\n Ok(())\\n}\\n\\nfn main() -> Result<()>\\nwhere\\n Self: Sized,\\n{\\n Self::run(\\"input\\")\\n}\\n```\\n\\nThis is all part of the `Solution` trait, which can implement methods while being\\ndependent on what is provided by the implementing types. 
In this case, we just\\nneed to bound the `Output` type to implement `Display` that is necessary for the\\n`info!` and format string there.\\n\\nNow we can get to first of the nasty things we are going to do\u2026 And it is the\\n`day()` method that you can see being used when constructing path to the input\\nfile. That method will generate a name of the file, e.g. `day01` and we know that\\nwe can _somehow_ deduce it from the structure name, given we name it reasonably.\\n\\n```rust\\nfn day() -> String {\\n let mut day = String::from(type_name::().split(\\"::\\").next().unwrap());\\n day.make_ascii_lowercase();\\n\\n day.to_string()\\n}\\n```\\n\\n:::caution `type_name`\\n\\nThis feature is still experimental and considered to be internal, it is not\\nadvised to use it any production code.\\n\\n:::\\n\\nAnd now we can get to the nastiest stuff :weary: We will **generate** the tests!\\n\\nWe want to be able to generate tests for sample input in a following way:\\n\\n```rust\\ntest_sample!(day_01, Day01, 42, 69);\\n```\\n\\nThere\'s not much we can do, so we will write a macro to generate the tests for us.\\n\\n```rust\\n#[macro_export]\\nmacro_rules! test_sample {\\n ($mod_name:ident, $day_struct:tt, $part_1:expr, $part_2:expr) => {\\n #[cfg(test)]\\n mod $mod_name {\\n use super::*;\\n\\n #[test]\\n fn test_part_1() {\\n let sample =\\n $day_struct::parse_input(&format!(\\"samples/{}.txt\\", $day_struct::day()));\\n assert_eq!($day_struct::part_1(&sample), $part_1);\\n }\\n\\n #[test]\\n fn test_part_2() {\\n let sample =\\n $day_struct::parse_input(&format!(\\"samples/{}.txt\\", $day_struct::day()));\\n assert_eq!($day_struct::part_2(&sample), $part_2);\\n }\\n }\\n };\\n}\\n```\\n\\nWe have used it in a similar way as macros in C/C++, one of the things that we\\ncan use to our advantage is defining \u201ctype\u201d of the parameters for the macro. All\\nparameters have their name prefixed with `$` sign and you can define various \u201cforms\u201d\\nof your macro. Let\'s go through it!\\n\\nWe have following parameters:\\n\\n- `$mod_name` which represents the name for the module with tests, it is typed\\n with `ident` which means that we want a valid identifier to be passed in.\\n- `$day_struct` represents the structure that will be used for tests, it is typed\\n with `tt` which represents a _token tree_, in our case it is a type.\\n- `$part_X` represents the expected output for the `X`th part and is of type `expr`\\n which literally means an _expression_.\\n\\nApart from that we need to use `#[macro_export]` to mark the macro as exported\\nfor usage outside of the module. 
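To make the fragment specifiers a bit more tangible, here is a tiny self-contained sketch (the macro and its names are made up, it is not part of the AoC skeleton) that uses the same kinds of parameters:

```rust
// `$fn_name` must be a valid identifier, `$ty` is a single token tree
// (here used as a type, just like the day struct above), and `$value`
// is an arbitrary expression.
macro_rules! make_getter {
    ($fn_name:ident, $ty:tt, $value:expr) => {
        fn $fn_name() -> $ty {
            $value
        }
    };
}

make_getter!(answer, i32, 40 + 2);

fn main() {
    assert_eq!(answer(), 42);
}
```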
Now our skeleton looks like:\\n\\n```rust\\nuse aoc_2022::*;\\n\\ntype Input = String;\\ntype Output = String;\\n\\nstruct DayXX;\\nimpl Solution for DayXX {\\n fn parse_input>(pathname: P) -> Input {\\n file_to_string(pathname)\\n }\\n\\n fn part_1(input: &Input) -> Output {\\n todo!()\\n }\\n\\n fn part_2(input: &Input) -> Output {\\n todo!()\\n }\\n}\\n\\nfn main() -> Result<()> {\\n // DayXX::run(\\"sample\\")\\n DayXX::main()\\n}\\n\\n// test_sample!(day_XX, DayXX, , );\\n```\\n\\n### Solution\\n\\nNot much to talk about, it is relatively easy to simulate.\\n\\n## [Day 10: Cathode-Ray Tube](https://adventofcode.com/2022/day/10)\\n\\n:::info tl;dr\\n\\nEmulating basic arithmetic operations on a CPU and drawing on CRT based on the\\nCPU\'s accumulator.\\n\\n:::\\n\\nIn this day I have discovered an issue with my design of the `Solution` trait.\\nAnd the issue is caused by different types of `Output` for the part 1 and part 2.\\n\\nProblem is relatively simple and consists of simulating a CPU, I have approached\\nit in a following way:\\n\\n```rust\\nfn evaluate_instructions(instructions: &[Instruction], mut out: Output) -> Output {\\n instructions\\n .iter()\\n .fold(State::new(), |state, instruction| {\\n state.execute(instruction, &mut out)\\n });\\n\\n out\\n}\\n```\\n\\nWe just take the instructions, we have some state of the CPU and we execute the\\ninstructions one-by-one. Perfect usage of the `fold` (or `reduce` as you may know\\nit from other languages).\\n\\nYou can also see that we have an `Output` type, so the question is how can we fix\\nthat problem. And the answer is very simple and _functional_. Rust allows you to\\nhave an `enumeration` that can _bear_ some other values apart from the type itself.\\n\\n:::tip\\n\\nWe could\'ve seen something like this with the `Result` type that can be\\ndefined as\\n\\n```rust\\nenum Result {\\n Ok(T),\\n Err(E)\\n}\\n```\\n\\n###### What does that mean though?\\n\\nWhen we have an `Ok` value, it has the result itself, and when we get an `Err`\\nvalue, it has the error. This also allows us to handle _results_ in a rather\\npretty way:\\n\\n```rust\\nmatch do_something(x) {\\n Ok(y) => {\\n println!(\\"SUCCESS: {}\\", y);\\n },\\n Err(y) => {\\n eprintln!(\\"ERROR: {}\\", y);\\n }\\n}\\n```\\n\\n:::\\n\\nMy solution has a following outline:\\n\\n```rust\\nfn execute(&self, i: &Instruction, output: &mut Output) -> State {\\n // execute the instruction\\n\\n // collect results if necessary\\n match output {\\n Output::Part1(x) => self.execute_part_1(y, x),\\n Output::Part2(x) => self.execute_part_2(y, x),\\n }\\n\\n // return the obtained state\\n new_state\\n}\\n```\\n\\nYou might think that it\'s a perfectly reasonable thing to do. Yes, **but** notice\\nthat the `match` statement doesn\'t _collect_ the changes in any way and also we\\npass `output` by `&mut`, so it is shared across each _iteration_ of the `fold`.\\n\\nThe dirty and ingenious thing is that `x`s are passed by `&mut` too and therefore\\nthey are directly modified by the helper functions. 
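A tiny self-contained illustration of that pattern (with made-up types, not the actual day's code), where matching on a `&mut` enum hands out `&mut` bindings to the payload and the shared value gets mutated in place:

```rust
enum Output {
    Part1(i32),
    Part2(Vec<i32>),
}

fn record(value: i32, out: &mut Output) {
    // `x` binds as `&mut i32` / `&mut Vec<i32>`, so the shared enum
    // is modified in place, exactly like in the helper functions above.
    match out {
        Output::Part1(x) => *x += value,
        Output::Part2(x) => x.push(value),
    }
}

fn main() {
    let mut p1 = Output::Part1(0);
    let mut p2 = Output::Part2(Vec::new());

    for v in [1, 2, 3] {
        record(v, &mut p1);
        record(v, &mut p2);
    }

    assert!(matches!(p1, Output::Part1(6)));
    assert!(matches!(p2, Output::Part2(v) if v == [1, 2, 3]));
}
```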
To sum it up and let it sit\\n\\n> We are **collecting** the result **into** an **enumeration** that is **shared**\\n> across **all** iterations of `fold`.\\n\\n### Solution\\n\\nSimilar to _Day 9_, but there are some technical details that can get you.\\n\\n## [Day 11: Monkey in the Middle](https://adventofcode.com/2022/day/11)\\n\\n:::info tl;dr\\n\\nSimulation of monkeys throwing stuff around and measuring your stress levels\\nwhile your stuff is being passed around.\\n\\n:::\\n\\nI think I decided to use regular expressions here for the first time, cause\\nparsing the input was a pain.\\n\\nAlso I didn\'t expect to implement Euclidean algorithm in Rust\u2026\\n\\n### Solution\\n\\nAgain, we\'re just running a simulation. Though I must admit it was very easy to\\nmake a small technical mistakes that could affect the final results very late.\\n\\n## [Day 12: Hill Climbing Algorithm](https://adventofcode.com/2022/day/12)\\n\\n:::info tl;dr\\n\\nFinding shortest path up the hill and also shortest path down to the ground while\\nalso rolling down the hill\u2026\\n\\n:::\\n\\nAs I have said in the _tl;dr_, we are looking for the shortest path, but the start\\nand goal differ for the part 1 and 2. So I have decided to refactor my solution\\nto a BFS algorithm that takes necessary parameters via functions:\\n\\n```rust\\nfn bfs(\\n graph: &[Vec], start: &Position, has_edge: F, is_target: G\\n) -> Option\\nwhere\\n F: Fn(&[Vec], &Position, &Position) -> bool,\\n G: Fn(&[Vec], &Position) -> bool\\n```\\n\\nWe pass the initial vertex from the caller and everything else is left to the BFS\\nalgorithm, based on the `has_edge` and `is_target` functions.\\n\\nThis was easy! And that is not very usual in Rust once you want to pass around\\nfunctions. :eyes:\\n\\n### Solution\\n\\nLooking for the shortest path\u2026 Must be Dijkstra, right? **Nope!** Half of the\\nReddit got jebaited though. In all fairness, nothing stops you from implementing\\nthe Dijkstra\'s algorithm for finding the shortest path, **but** if you know that\\nall connected vertices are in a unit (actually $d = 1$) distance from each other,\\nthen you know that running Dijkstra is equivalent to running BFS, only with worse\\ntime complexity, because of the priority heap instead of the queue.\\n\\n## [Day 13: Distress Signal](https://adventofcode.com/2022/day/13)\\n\\n:::info tl;dr\\n\\nProcessing packets with structured data from the distress signal.\\n\\n:::\\n\\nYou can implement a lot of traits if you want to. It is _imperative_ to implement\\nordering on the packets. I had a typo, so I also proceeded to implement a `Display`\\ntrait for debugging purposes:\\n\\n```rust\\nimpl Display for Packet {\\n fn fmt(&self, f: &mut std::fmt::Formatter<\'_>) -> std::fmt::Result {\\n match self {\\n Packet::Integer(x) => write!(f, \\"{x}\\"),\\n Packet::List(lst) => write!(f, \\"[{}]\\", lst.iter().map(|p| format!(\\"{p}\\")).join(\\",\\")),\\n }\\n }\\n}\\n```\\n\\n### Solution\\n\\nA lot of technical details\u2026 Parsing is nasty too\u2026\\n\\n## [Day 14: Regolith Reservoir](https://adventofcode.com/2022/day/14)\\n\\n:::info tl;dr\\n\\nLet\'s simulate falling sand grain-by-grain.\\n\\n:::\\n\\nAgain, both parts are relatively similar with minimal changes, so it is a good\\nidea to refactor it a bit. Similar approach to the [BFS above]. 
Also this is the\\nfirst day where I ran into efficiency issues and had to redo my solution to speed\\nit up just a bit.\\n\\n### Solution\\n\\nTedious.\\n\\n## Post Mortem\\n\\n### Indexing\\n\\nI was asked about the indexing after publishing the blog. And truly it is rather\\ncomplicated topic, especially after releasing `SliceIndex` trait. I couldn\'t\\nleave it be, so I tried to implement the `Index` and `IndexMut` trait.\\n\\n:::note\\n\\nI have also mentioned that the `SliceIndex` trait is `unsafe`, but truth be told,\\nonly _unsafe_ part are the 2 methods that are named `*unchecked*`. Anyways, I will\\nbe implementing the `Index*` traits for now, rather than the `SliceIndex`.\\n\\n:::\\n\\nIt\'s relatively straightforward\u2026\\n\\n```rust\\nimpl Index> for [C]\\nwhere\\n I: Copy + TryInto,\\n >::Error: Debug,\\n C: Index,\\n{\\n type Output = C::Output;\\n\\n fn index(&self, index: Vector2D) -> &Self::Output {\\n let (x, y): (usize, usize) =\\n (index.x.try_into().unwrap(), index.y.try_into().unwrap());\\n &self[y][x]\\n }\\n}\\n\\nimpl IndexMut> for [C]\\nwhere\\n I: Copy + TryInto,\\n >::Error: Debug,\\n C: IndexMut,\\n{\\n fn index_mut(&mut self, index: Vector2D) -> &mut Self::Output {\\n let (x, y): (usize, usize) =\\n (index.x.try_into().unwrap(), index.y.try_into().unwrap());\\n &mut self[y][x]\\n }\\n}\\n```\\n\\nWe can see a lot of similarities to the implementation of `index` and `index_mut`\\nfunctions. In the end, they are 1:1, just wrapped in the trait that provides a\\nsyntax sugar for `container[idx]`.\\n\\n:::note\\n\\nI have also switched from using the `TryFrom` to `TryInto` trait, since it better\\nmatches what we are using, the `.try_into` rather than `usize::try_from`.\\n\\nAlso implementing `TryFrom` automatically provides you with a `TryInto` trait,\\nsince it is relatively easy to implement. Just compare the following:\\n\\n```rust\\npub trait TryFrom: Sized {\\n type Error;\\n\\n fn try_from(value: T) -> Result;\\n}\\n\\npub trait TryInto: Sized {\\n type Error;\\n\\n fn try_into(self) -> Result;\\n}\\n```\\n\\n:::\\n\\nOK, so we have our trait implemented, we should be able to use `container[index]`,\\nright? Yes\u2026 but actually no :frowning:\\n\\n```\\nerror[E0277]: the type `[std::vec::Vec]` cannot be indexed by `aoc_2022::Vector2D`\\n --\x3e src/bin/day08.rs:26:18\\n |\\n26 | if trees[pos] > tallest {\\n | ^^^ slice indices are of type `usize` or ranges of `usize`\\n |\\n = help: the trait `std::slice::SliceIndex<[std::vec::Vec]>` is not implemented for `aoc_2022::Vector2D`\\n = note: required for `std::vec::Vec>` to implement `std::ops::Index>`\\n\\nerror[E0277]: the type `[std::vec::Vec]` cannot be indexed by `aoc_2022::Vector2D`\\n --\x3e src/bin/day08.rs:30:28\\n |\\n30 | max(tallest, trees[pos])\\n | ^^^ slice indices are of type `usize` or ranges of `usize`\\n |\\n = help: the trait `std::slice::SliceIndex<[std::vec::Vec]>` is not implemented for `aoc_2022::Vector2D`\\n = note: required for `std::vec::Vec>` to implement `std::ops::Index>`\\n\\nerror[E0277]: the type `[std::vec::Vec]` cannot be indexed by `aoc_2022::Vector2D`\\n --\x3e src/bin/day08.rs:52:28\\n |\\n52 | let max_height = trees[position];\\n | ^^^^^^^^ slice indices are of type `usize` or ranges of `usize`\\n |\\n = help: the trait `std::slice::SliceIndex<[std::vec::Vec]>` is not implemented for `aoc_2022::Vector2D`\\n = note: required for `std::vec::Vec>` to implement `std::ops::Index>`\\n```\\n\\nWhy? We have it implemented for the slices (`[C]`), why doesn\'t it work? 
Well,\\nthe fun part consists of the fact that in other place, where we were using it,\\nwe were passing the `&[Vec]`, but this is coming from a helper functions that\\ntake `&Vec>` instead. And\u2026 we don\'t implement `Index` and `IndexMut` for\\nthose. Just for the slices. \ud83e\udd2f _What are we going to do about it?_\\n\\nWe can either start copy-pasting or be smarter about it\u2026 I choose to be smarter,\\nso let\'s implement a macro! The only difference across the implementations are\\nthe types of the outer containers. Implementation doesn\'t differ **at all**!\\n\\nImplementing the macro can be done in a following way:\\n\\n```rust\\nmacro_rules! generate_indices {\\n ($container:ty) => {\\n impl Index> for $container\\n where\\n I: Copy + TryInto,\\n >::Error: Debug,\\n C: Index,\\n {\\n type Output = C::Output;\\n\\n fn index(&self, index: Vector2D) -> &Self::Output {\\n let (x, y): (usize, usize) =\\n (index.x.try_into().unwrap(), index.y.try_into().unwrap());\\n &self[y][x]\\n }\\n }\\n\\n impl IndexMut> for $container\\n where\\n I: Copy + TryInto,\\n >::Error: Debug,\\n C: IndexMut,\\n {\\n fn index_mut(&mut self, index: Vector2D) -> &mut Self::Output {\\n let (x, y): (usize, usize) =\\n (index.x.try_into().unwrap(), index.y.try_into().unwrap());\\n &mut self[y][x]\\n }\\n }\\n };\\n}\\n```\\n\\nAnd now we can simply do\\n\\n```rust\\ngenerate_indices!(VecDeque);\\ngenerate_indices!([C]);\\ngenerate_indices!(Vec);\\n// generate_indices!([C; N], const N: usize);\\n```\\n\\nThe last type (I took the inspiration from the implementations of the `Index` and\\n`IndexMut` traits) is a bit problematic, because of the `const N: usize` part,\\nwhich I haven\'t managed to be able to parse. And that\'s how I got rid of the error.\\n\\n:::note\\n\\nIf I were to use 2D-indexing over `[C; N]` slices, I\'d probably just go with the\\ncopy-paste, cause the cost of this \u201cmonstrosity\u201d outweighs the benefits of no DRY.\\n\\n:::\\n\\n#### Cause of the problem\\n\\nThis issue is relatively funny. If you don\'t use any type aliases, just the raw\\ntypes, you\'ll get suggested certain changes by the _clippy_. 
For example if you\\nconsider the following piece of code\\n\\n```rust\\nfn get_sum(nums: &Vec) -> i32 {\\n nums.iter().sum()\\n}\\n\\nfn main() {\\n let nums = vec![1, 2, 3];\\n println!(\\"Sum: {}\\", get_sum(&nums));\\n}\\n```\\n\\nand you run _clippy_ on it, you will get\\n\\n```\\nChecking playground v0.0.1 (/playground)\\nwarning: writing `&Vec` instead of `&[_]` involves a new object where a slice will do\\n --\x3e src/main.rs:1:18\\n |\\n1 | fn get_sum(nums: &Vec) -> i32 {\\n | ^^^^^^^^^ help: change this to: `&[i32]`\\n |\\n = help: for further information visit https://rust-lang.github.io/rust-clippy/master/index.html#ptr_arg\\n = note: `#[warn(clippy::ptr_arg)]` on by default\\n\\nwarning: `playground` (bin \\"playground\\") generated 1 warning\\n Finished dev [unoptimized + debuginfo] target(s) in 0.61s\\n```\\n\\nHowever, if you introduce a type alias, such as\\n\\n```rust\\ntype Numbers = Vec;\\n```\\n\\nThen _clippy_ won\'t say anything, cause there is literally nothing to suggest.\\nHowever the outcome is not the same\u2026\\n\\n[_advent of code_]: https://adventofcode.com\\n[bfs above]: #day-12-hill-climbing-algorithm"},{"id":"aoc-2022/1st-week","metadata":{"permalink":"/blog/aoc-2022/1st-week","editUrl":"https://github.com/mfocko/blog/tree/main/blog/aoc-2022/01-week-1.md","source":"@site/blog/aoc-2022/01-week-1.md","title":"1st week of Advent of Code \'22 in Rust","description":"Surviving first week in Rust.","date":"2022-12-15T01:15:00.000Z","formattedDate":"December 15, 2022","tags":[{"label":"advent-of-code","permalink":"/blog/tags/advent-of-code"},{"label":"advent-of-code-2022","permalink":"/blog/tags/advent-of-code-2022"},{"label":"rust","permalink":"/blog/tags/rust"}],"readingTime":12.4,"hasTruncateMarker":true,"authors":[{"name":"Matej Focko","email":"me+blog@mfocko.xyz","title":"a.k.a. @mf","url":"https://gitlab.com/mfocko","imageURL":"https://github.com/mfocko.png","key":"mf"}],"frontMatter":{"title":"1st week of Advent of Code \'22 in Rust","description":"Surviving first week in Rust.","date":"2022-12-15T01:15","slug":"aoc-2022/1st-week","authors":"mf","tags":["advent-of-code","advent-of-code-2022","rust"],"hide_table_of_contents":false},"unlisted":false,"prevItem":{"title":"2nd week of Advent of Code \'22 in Rust","permalink":"/blog/aoc-2022/2nd-week"},"nextItem":{"title":"Advent of Code \'22 in Rust","permalink":"/blog/aoc-2022/intro"}},"content":"Let\'s go through the first week of [_Advent of Code_] in Rust.\\n\\n\x3c!--truncate--\x3e\\n\\n:::note\\n\\nIf you wish to have a look at the solutions, you can follow them on my [GitLab].\\nMore specifically in the [`/src/bin/`].\\n\\n:::\\n\\nI will try to summarize my experience with using Rust for the AoC. Trying it out\\nages ago, I believe it will be _pain and suffering_, but we will see. For each\\nday I will also try to give a tl;dr of the problem, so that you can better imagine\\nthe relation to my woes or :+1: moments.\\n\\n## [Day 1: Calorie Counting](https://adventofcode.com/2022/day/1)\\n\\n:::info tl;dr\\n\\nAs the name suggests, we get the calories of the food contained in the elves\\nbackpacks and we want to choose the elf that has the most food ;)\\n\\n:::\\n\\n> Wakey wakey!\\n\\nProgramming in Rust at 6am definitely hits. I\'ve also forgotten to mention how I\\nhandle samples. With each puzzle you usually get a sample input and expected\\noutput. 
You can use them to verify that your solution works, or usually doesn\'t.\\n\\nAt first I\'ve decided to put asserts into my `main`, something like\\n\\n```rust\\nassert_eq!(part_1(&sample), 24000);\\ninfo!(\\"Part 1: {}\\", part_1(&input));\\n\\nassert_eq!(part_2(&sample), 45000);\\ninfo!(\\"Part 2: {}\\", part_2(&input));\\n```\\n\\nHowever, once you get further, the sample input may take some time to run itself.\\nSo in the end, I have decided to turn them into unit tests:\\n\\n```rust\\n#[cfg(test)]\\nmod tests {\\n use super::*;\\n\\n #[test]\\n fn test_part_1() {\\n let sample = parse_input(\\"samples/day01.txt\\");\\n assert_eq!(part_1(&sample), 24000);\\n }\\n\\n #[test]\\n fn test_part_2() {\\n let sample = parse_input(\\"samples/day01.txt\\");\\n assert_eq!(part_2(&sample), 45000);\\n }\\n}\\n```\\n\\nAnd later on I have noticed, it\'s hard to tell the difference between the days,\\nso I further renamed the `mod` from generic `tests` to reflect the days.\\n\\nAlso after finishing the first day puzzle, I have installed an [`sccache`] to\\ncache the builds, so that the build time is lower, cause it was kinda unbearable.\\n\\n### Solution\\n\\nWell, it\'s a pretty simple problem. You just take the input, sum the calories and\\nfind the biggest one. However, if we try to generalize to more than the biggest\\none, the fun appears. We have few options:\\n\\n- keep all the calories, sort them, take what we need\\n- keep all the calories and use max heap\\n- use min heap and maintain at most N calories that we need\\n\\n## [Day 2: Rock Paper Scissors](https://adventofcode.com/2022/day/2)\\n\\n:::info tl;dr\\n\\nYou want to know what score did you achieve while playing _Rock Paper Scissors_.\\nAnd then you want to be strategic about it.\\n\\n:::\\n\\nApart from the technical details of the puzzle, it went relatively smooth.\\n\\n### Solution\\n\\nI took relatively na\xefve approach and then tried to simplify it.\\n\\n## [Day 3: Rucksack Reorganization](https://adventofcode.com/2022/day/3)\\n\\n:::info tl;dr\\n\\nLet\'s go reorganize elves\' backpacks! Each backpacks has 2 compartments and you\\nwant to find the common item among those compartments. Each of them has priority,\\nyou care only about the sum.\\n\\n:::\\n\\nThis is the day where I started to fight the compiler and neither of us decided\\nto give up. Let\'s dive into it \\\\o/\\n\\n:::tip Fun fact\\n\\nFighting the compiler took me 30 minutes.\\n\\n:::\\n\\nWe need to find a common item among 2 collections, that\'s an easy task, right?\\nWe can construct 2 sets and find an intersection:\\n\\n```rust\\nlet top: HashSet = [1, 2, 3].iter().collect();\\nlet bottom: HashSet = [3, 4, 5].iter().collect();\\n```\\n\\nNow, the first issue that we encounter is caused by the fact that we are using\\na slice (the `[\u2026]`), iterator of that returns **references** to the numbers.\\nAnd we get immediately yelled at by the compiler, because the numbers are discarded\\nafter running the `.collect`. To fix this, we can use `.into_iter`:\\n\\n```rust\\nlet top: HashSet = [1, 2, 3].into_iter().collect();\\nlet bottom: HashSet = [3, 4, 5].into_iter().collect();\\n```\\n\\nThis way the numbers will get copied instead of referenced. OK, let\'s find the\\nintersection of those 2 collections:\\n\\n```rust\\nprintln!(\\"Common elements: {:?}\\", top.intersection(&bottom));\\n```\\n\\n```\\nCommon elements: [3]\\n```\\n\\n:::caution\\n\\nNotice that we need to do `&bottom`. It explicitly specifies that `.intersection`\\n**borrows** the `bottom`, i.e. 
takes an immutable reference to it.\\n\\n:::\\n\\nThat\'s what we want, right? Looks like it! \\\\o/\\n\\nThe next part wants us to find the common element among all of the backpacks. OK, so\\nthat should be fairly easy, we have an intersection and we want to find the intersection\\nover all of them.\\n\\nLet\'s have a look at the type of the `.intersection`:\\n\\n```rust\\npub fn intersection<\'a>(\\n\xa0\xa0\xa0\xa0&\'a self,\\n\xa0\xa0\xa0\xa0other: &\'a HashSet<T, S>\\n) -> Intersection<\'a, T, S>\\n```\\n\\nOK\u2026 Huh\u2026 But we have an example there!\\n\\n```rust\\nlet intersection: HashSet<_> = a.intersection(&b).collect();\\n```\\n\\nCool, that\'s all we need.\\n\\n```rust\\nlet top: HashSet<i32> = [1, 2, 3, 4].into_iter().collect();\\nlet bottom: HashSet<i32> = [3, 4, 5, 6].into_iter().collect();\\nlet top_2: HashSet<i32> = [2, 3, 4, 5, 6].into_iter().collect();\\nlet bottom_2: HashSet<i32> = [4, 5, 6].into_iter().collect();\\n\\nlet intersection: HashSet<_> = top.intersection(&bottom).collect();\\nprintln!(\\"Intersection: {:?}\\", intersection);\\n```\\n\\n```\\nIntersection: {3, 4}\\n```\\n\\nCool, so let\'s do the intersection with the `top_2`:\\n\\n```rust\\nlet top: HashSet<i32> = [1, 2, 3, 4].into_iter().collect();\\nlet bottom: HashSet<i32> = [3, 4, 5, 6].into_iter().collect();\\nlet top_2: HashSet<i32> = [2, 3, 4, 5, 6].into_iter().collect();\\nlet bottom_2: HashSet<i32> = [4, 5, 6].into_iter().collect();\\n\\nlet intersection: HashSet<_> = top.intersection(&bottom).collect();\\nlet intersection: HashSet<_> = intersection.intersection(&top_2).collect();\\nprintln!(\\"Intersection: {:?}\\", intersection);\\n```\\n\\nAnd we get yelled at by the compiler:\\n\\n```\\nerror[E0308]: mismatched types\\n  --\x3e src/main.rs:10:58\\n   |\\n10 | let intersection: HashSet<_> = intersection.intersection(&top_2).collect();\\n   |                                             ------------ ^^^^^^ expected `&i32`, found `i32`\\n   |                                             |\\n   |                                             arguments to this function are incorrect\\n   |\\n   = note: expected reference `&HashSet<&i32>`\\n              found reference `&HashSet<i32>`\\n```\\n\\n/o\\\\ What the hell is going on here? Well, the funny thing is that this operation\\ndoesn\'t return the elements themselves, but references to them, and when we pass in\\nthe third set, it holds just the values themselves, without any references.\\n\\n:::tip\\n\\nIt may seem like a very weird decision, but in fact it makes some sense\u2026 It allows\\nyou to do an intersection of items that may not be possible to copy. Overall this is\\na \u201ctax\u201d for having a borrow checker ~~drilling your ass~~ having your back and\\nmaking sure you\'re not doing something naughty that may cause **undefined**\\n**behavior**.\\n\\n:::\\n\\nTo resolve this, we need to get an iterator that **clones** the elements:\\n\\n```rust\\nlet top: HashSet<i32> = [1, 2, 3, 4].into_iter().collect();\\nlet bottom: HashSet<i32> = [3, 4, 5, 6].into_iter().collect();\\nlet top_2: HashSet<i32> = [2, 3, 4, 5, 6].into_iter().collect();\\nlet bottom_2: HashSet<i32> = [4, 5, 6].into_iter().collect();\\n\\nlet intersection: HashSet<_> = top.intersection(&bottom).cloned().collect();\\nlet intersection: HashSet<_> = intersection.intersection(&top_2).cloned().collect();\\nlet intersection: HashSet<_> = intersection.intersection(&bottom_2).cloned().collect();\\nprintln!(\\"Intersection: {:?}\\", intersection);\\n```\\n\\n```\\nIntersection: {4}\\n```\\n\\n### Solution\\n\\nThe approach is pretty simple, if you omit the _1on1 with the compiler_. 
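For illustration, the per-rucksack part could then look roughly like this (the\\n`priority` and `rucksack_priority` helpers are made up for this post, not the\\nactual code from my solution):\\n\\n```rust\\nuse std::collections::HashSet;\\n\\n// Maps lowercase items to priorities 1..=26 and uppercase items to 27..=52.\\nfn priority(c: char) -> u32 {\\n    match c {\\n        \'a\'..=\'z\' => c as u32 - \'a\' as u32 + 1,\\n        \'A\'..=\'Z\' => c as u32 - \'A\' as u32 + 27,\\n        _ => 0,\\n    }\\n}\\n\\n// Splits the rucksack into its two compartments and sums the priorities of the\\n// items that appear in both of them.\\nfn rucksack_priority(line: &str) -> u32 {\\n    let (top, bottom) = line.split_at(line.len() / 2);\\n\\n    let top: HashSet<char> = top.chars().collect();\\n    let bottom: HashSet<char> = bottom.chars().collect();\\n\\n    top.intersection(&bottom).map(|c| priority(*c)).sum()\\n}\\n```\\n\\n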
You just\\nhave some fun with the set operations :)\\n\\n## [Day 4: Camp Cleanup](https://adventofcode.com/2022/day/4)\\n\\n:::info tl;dr\\n\\nElves are cleaning up the camp and they got overlapping sections to clean up.\\nFind how many overlap and can take the day off.\\n\\n:::\\n\\n[`RangeInclusive`] is your friend not an enemy :)\\n\\n### Solution\\n\\nRelatively easy, you just need to parse the input and know what you want. Rust\'s\\n`RangeInclusive` type helped a lot, cause it took care of all abstractions.\\n\\n## [Day 5: Supply Stacks](https://adventofcode.com/2022/day/5)\\n\\n:::info tl;dr\\n\\nLet\'s play with stacks of crates.\\n\\n:::\\n\\nVery easy problem with very annoying input. You can judge yourself:\\n\\n```\\n [D]\\n[N] [C]\\n[Z] [M] [P]\\n 1 2 3\\n\\nmove 1 from 2 to 1\\nmove 3 from 1 to 3\\nmove 2 from 2 to 1\\nmove 1 from 1 to 2\\n```\\n\\nGood luck transforming that into something reasonable :)\\n\\n:::tip Fun fact\\n\\nTook me 40 minutes to parse this reasonably, including fighting the compiler.\\n\\n:::\\n\\n### Solution\\n\\nFor the initial solution I went with a manual solution (as in _I have done all_\\n_the work_. Later on I have decided to explore the `std` and interface of the\\n`std::vec::Vec` and found [`split_off`] which takes an index and splits (duh)\\nthe vector:\\n\\n```rust\\nlet mut vec = vec![1, 2, 3];\\nlet vec2 = vec.split_off(1);\\nassert_eq!(vec, [1]);\\nassert_eq!(vec2, [2, 3]);\\n```\\n\\nThis helped me simplify my solution a lot and also get rid of some _edge cases_.\\n\\n## [Day 6: Tuning Trouble](https://adventofcode.com/2022/day/6)\\n\\n:::info tl;dr\\n\\nFinding start of the message in a very weird protocol. Start of the message is\\ndenoted by $N$ unique consecutive characters.\\n\\n:::\\n\\n### Solution\\n\\nA lot of different approaches, knowing that we are dealing with input consisting\\nsolely of ASCII letters, I bit the bullet and went with sliding window and\\nconstructing sets from that window, checking if the set is as big as the window.\\n\\nOne possible optimization could consist of keeping a bit-vector (i.e. `usize`\\nvariable) of encountered characters and updating it as we go. However this has\\na different issue and that is removal of the characters from the left side of the\\nwindow. We don\'t know if the same character is not included later on.\\n\\nOther option is to do similar thing, but keeping the frequencies of the letters,\\nand again knowing we have only ASCII letters we can optimize by having a vector\\nof 26 elements that keeps count for each lowercase letter.\\n\\n## [Day 7: No Space Left On Device](https://adventofcode.com/2022/day/7)\\n\\n:::info tl;dr\\n\\nLet\'s simulate [`du`] to get some stats about our file system and then pinpoint\\ndirectories that take a lot of space and should be deleted.\\n\\n:::\\n\\n> I was waiting for this moment, and yet it got me!\\n> _imagine me swearing for hours_\\n\\n### Solution\\n\\nWe need to \u201c_build_\u201d a file system from the input that is given in a following form:\\n\\n```\\n$ cd /\\n$ ls\\ndir a\\n14848514 b.txt\\n8504156 c.dat\\ndir d\\n$ cd a\\n$ ls\\ndir e\\n29116 f\\n2557 g\\n62596 h.lst\\n$ cd e\\n$ ls\\n584 i\\n$ cd ..\\n$ cd ..\\n$ cd d\\n$ ls\\n4060174 j\\n8033020 d.log\\n5626152 d.ext\\n7214296 k\\n```\\n\\nThere are few ways in which you can achieve this and also you can assume some\\npreconditions, but why would we do that, right? 
:)\\n\\nYou can \u201cslap\u201d this in either [`HashMap`] or [`BTreeMap`] and call it a day.\\nAnd that would be boring\u2026\\n\\n:::tip\\n\\n`BTreeMap` is quite fitting for this, don\'t you think?\\n\\n:::\\n\\nI always wanted to try allocation on the heap in Rust, so I chose to implement a tree.\\nI fought with the `Box` for some time and was losing\u2026\\n\\nThen I looked up some implementations of trees or linked lists and decided to try\\n`Rc<RefCell<T>>`. And I got my _ass whopped_ by the compiler once again. /o\\\\\\n\\n:::tip\\n\\n`Box<T>` represents dynamically allocated memory on the heap. It is a single pointer;\\nyou can imagine it as `std::unique_ptr<T>` in C++.\\n\\n`Rc<T>` represents dynamically allocated memory on the heap. On top of that it is\\n_reference counted_ (that\'s what the `Rc` stands for). You can imagine it as\\n`std::shared_ptr<T>` in C++.\\n\\nNow the fun stuff. Neither of them lets you **mutate** the contents of the memory.\\n\\n`Cell<T>` allows you to mutate the memory. It can be used reasonably with types that\\ncan be copied, because the memory safety is guaranteed by copying the contents\\nwhen there is more than one **mutable** reference to the memory.\\n\\n`RefCell<T>` is similar to the `Cell<T>`, but the borrowing rules (how many mutable\\nreferences are present) are checked dynamically.\\n\\nSo in the end, if you want something like `std::shared_ptr<T>` in Rust, you want\\nto have `Rc<RefCell<T>>`.\\n\\n:::\\n\\nSo, how are we going to represent the file system then? We will use an enumeration,\\nhehe, which is an algebraic data type that can store some stuff in itself :weary:\\n\\n```rust\\ntype FileHandle = Rc<RefCell<AocFile>>;\\n\\n#[derive(Debug)]\\nenum AocFile {\\n    File(usize),\\n    Directory(BTreeMap<String, FileHandle>),\\n}\\n```\\n\\nLet\'s go over it! `FileHandle` represents a dynamically allocated `AocFile`, not\\nmuch to discuss. What does the `#[derive(Debug)]` do though? It lets us print\\nout the value of that enumeration; it\'s derived, so it\'s not as good as if we had\\nimplemented it ourselves, but it\'s good enough for debugging, hence the name.\\n\\nNow to the fun part! An `AocFile` value can be represented in two ways:\\n\\n- `File(usize)`, e.g. `AocFile::File(123)`, and we can pattern match it, if we\\n  need to\\n- `Directory(BTreeMap<String, FileHandle>)` will represent the directory and will\\n  contain a map matching the names of the files (or directories) within to their\\n  respective file handles\\n\\nI will omit the details about constructing this file system, cause there are a lot\\nof technicalities introduced by the nature of the input. However, if you are\\ninterested, you can have a look at my solution.\\n\\nWe need to find small enough directories and also find the smallest directory that\\nwill free enough space. Now the question is how we could do that, and there are\\nmultiple ways, which I will describe.\\n\\nI have chosen to implement [_tree catamorphism_] :weary:. It is basically a fold\\nover a tree data structure. We descend down into the leaves and propagate computed\\nresults all the way to the root. You can also notice that this approach is very\\nsimilar to _dynamic programming_: we find overlapping sections of the computation\\nand try to minimize the additional work (in this case we need to know the sizes of\\nour descendants, but we have already been there).\\n\\nAnother approach that has been suggested to me a few days later is running DFS on\\nthe graph. And, funnily enough, we would still need to combine what we found in\\nthe branches where we descend. 
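Just to illustrate the point, a rough sketch of such a traversal could look\\nsomething like this (`directory_sizes` is a helper made up for this post, not the\\nactual code from my solution, and the types are repeated from above only to keep\\nthe snippet self-contained):\\n\\n```rust\\nuse std::cell::RefCell;\\nuse std::collections::BTreeMap;\\nuse std::rc::Rc;\\n\\n// Same types as above, repeated only so that the snippet compiles on its own.\\ntype FileHandle = Rc<RefCell<AocFile>>;\\n\\n#[derive(Debug)]\\nenum AocFile {\\n    File(usize),\\n    Directory(BTreeMap<String, FileHandle>),\\n}\\n\\n// Walks the tree from `file` downwards, returns the total size of `file` and\\n// pushes the size of every directory it visits into `sizes`.\\nfn directory_sizes(file: &FileHandle, sizes: &mut Vec<usize>) -> usize {\\n    match &*file.borrow() {\\n        AocFile::File(size) => *size,\\n        AocFile::Directory(entries) => {\\n            // combine the results from the branches we have descended into\\n            let size = entries.values().map(|f| directory_sizes(f, sizes)).sum();\\n            sizes.push(size);\\n            size\\n        }\\n    }\\n}\\n```\\n\\n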
So in the end, it would work very similarly to my\\nsolution.\\n\\nOne of the more exotic options would be precomputing the required information at\\nthe same time as parsing. That could be done by adding additional fields to the\\nnodes, which would allow storing such information and updating it as we construct\\nthe file system.\\n\\n## Post Mortem\\n\\nThings that have been brought up in the discussion later on.\\n\\n### `Rc<T>` vs `Rc<RefCell<T>>`\\n\\nIt has been brought up that I have made contradictory statements regarding the\\ndynamically allocated memory. Specifically:\\n\\n- You can imagine `Rc<T>` as an `std::shared_ptr<T>` (in C++)\\n- When you want an equivalent of `std::shared_ptr<T>`, you want to use\\n  `Rc<RefCell<T>>`\\n\\nNow, in Rust it is a bit more complicated, because the type that represents the\\n\u201cshared pointer\u201d is `Rc<T>`. What `RefCell<T>` does is make sure that there is\\nonly one \u201cowner\u201d of a mutable reference at a time (and it does so dynamically, as\\nopposed to the `Cell<T>`).\\n\\nTherefore, to be precise and correct about the equivalents of `std::shared_ptr<T>`\\nin Rust, we can say that\\n\\n- `Rc<T>` is an equivalent of a `const std::shared_ptr<T>`,\\n- and `Rc<RefCell<T>>` is an equivalent of a `std::shared_ptr<T>`.\\n\\nYou can easily see that they differ only in the mutability. (And even that is not\\nas simple as it seems, because there is also `Cell<T>`.)\\n\\n[_advent of code_]: https://adventofcode.com\\n[gitlab]: https://gitlab.com/mfocko/advent-of-code-2022\\n[`/src/bin/`]: https://gitlab.com/mfocko/advent-of-code-2022/-/tree/main/src/bin\\n[`sccache`]: https://github.com/mozilla/sccache\\n[`rangeinclusive`]: https://doc.rust-lang.org/std/ops/struct.RangeInclusive.html\\n[`split_off`]: https://doc.rust-lang.org/std/vec/struct.Vec.html#method.split_off\\n[`du`]: https://www.man7.org/linux/man-pages/man1/du.1.html\\n[`hashmap`]: https://doc.rust-lang.org/std/collections/struct.HashMap.html\\n[`btreemap`]: https://doc.rust-lang.org/std/collections/struct.BTreeMap.html\\n[_tree catamorphism_]: https://en.wikipedia.org/wiki/Catamorphism#Tree_fold"},{"id":"aoc-2022/intro","metadata":{"permalink":"/blog/aoc-2022/intro","editUrl":"https://github.com/mfocko/blog/tree/main/blog/aoc-2022/00-intro.md","source":"@site/blog/aoc-2022/00-intro.md","title":"Advent of Code \'22 in Rust","description":"Preparing for Advent of Code \'22.","date":"2022-12-14T21:45:00.000Z","formattedDate":"December 14, 2022","tags":[{"label":"advent-of-code","permalink":"/blog/tags/advent-of-code"},{"label":"advent-of-code-2022","permalink":"/blog/tags/advent-of-code-2022"},{"label":"rust","permalink":"/blog/tags/rust"}],"readingTime":8.665,"hasTruncateMarker":true,"authors":[{"name":"Matej Focko","email":"me+blog@mfocko.xyz","title":"a.k.a. 
@mf","url":"https://gitlab.com/mfocko","imageURL":"https://github.com/mfocko.png","key":"mf"}],"frontMatter":{"title":"Advent of Code \'22 in Rust","description":"Preparing for Advent of Code \'22.","date":"2022-12-14T21:45","slug":"aoc-2022/intro","authors":"mf","tags":["advent-of-code","advent-of-code-2022","rust"],"hide_table_of_contents":false},"unlisted":false,"prevItem":{"title":"1st week of Advent of Code \'22 in Rust","permalink":"/blog/aoc-2022/1st-week"}},"content":"Let\'s talk about the preparations for this year\'s [_Advent of Code_].\\n\\n\x3c!--truncate--\x3e\\n\\n## Choosing a language\\n\\nWhen choosing a language for AoC, you usually want a language that gives you a\\nquick feedback which allows you to iterate quickly to the solution of the puzzle.\\nOne of the most common choices is Python, many people also use JavaScript or Ruby.\\n\\nGiven the competitive nature of the AoC and popularity among competitive programming,\\nC++ might be also a very good choice. Only if you are familiar with it, I guess\u2026\\n\\nIf you want a challenge, you might also choose to rotate the languages each day.\\nThough I prefer to use only one language.\\n\\nFor this year I have been deciding between _Rust_, _C++_ and _Pascal_ or _Ada_.\\n\\nI have tried Rust last year and have survived with it for 3 days and then gave\\nup and switched to _Kotlin_, which was pretty good given it is \u201cJava undercover\u201d.\\nI pretty much like the ideas behind Rust, I am not sure about the whole cult and\\nimplementation of those ideas though. After some years with C/C++, I would say\\nthat Rust feels _too safe_ for my taste and tries to \u201c_punish me_\u201d even for the\\nmost trivial things.\\n\\nC++ is a very robust, but also comes with a wide variety of options providing you\\nthe ability to shoot yourself in the leg. I have tried to solve few days of previous\\nAdvent of Code events, it was _relatively easy_ to solve the problems in C++, given\\nthat I do not admit writing my own iterator for `enumerate`\u2026\\n\\nPascal or Ada were meme choices :) Ada is heavily inspired by Pascal and has a\\npretty nice standard library that offers enough to be able to quickly solve some\\nproblems in it. However the toolkit is questionable :/\\n\\n## Choosing libraries\\n\\n## Preparations for Rust\\n\\nAll of the sources, later on including solutions, can be found at my\\n[GitLab].\\n\\n### Toolkit\\n\\nSince we are using Rust, we are going to use a [Cargo] and more than likely VSCode\\nwith [`rust-analyzer`]. Because of my choice of libraries, we will also introduce\\na `.envrc` file that can be used by [`direnv`], which allows you to set specific\\nenvironment variables when you enter a directory. In our case, we will use\\n\\n```bash\\n# to show nice backtrace when using the color-eyre\\nexport RUST_BACKTRACE=1\\n\\n# to catch logs generated by tracing\\nexport RUST_LOG=trace\\n```\\n\\nAnd for the one of the most obnoxious things ever, we will use a script to download\\nthe inputs instead of \u201c_clicking, opening and copying to a file_\u201d[^1]. 
There is\\nno need to be _fancy_, so we will adjust Python script by Martin[^2].\\n\\n```py\\n#!/usr/bin/env python3\\n\\nimport datetime\\nimport yaml\\nimport requests\\nimport sys\\n\\n\\ndef load_config():\\n with open(\\"env.yaml\\", \\"r\\") as f:\\n js = yaml.load(f, Loader=yaml.Loader)\\n return js[\\"session\\"], js[\\"year\\"]\\n\\n\\ndef get_input(session, year, day):\\n return requests.get(\\n f\\"https://adventofcode.com/{year}/day/{day}/input\\",\\n cookies={\\"session\\": session},\\n headers={\\n \\"User-Agent\\": \\"{repo} by {mail}\\".format(\\n repo=\\"gitlab.com/mfocko/advent-of-code-2022\\",\\n mail=\\"me@mfocko.xyz\\",\\n )\\n },\\n ).content.decode(\\"utf-8\\")\\n\\n\\ndef main():\\n day = datetime.datetime.now().day\\n if len(sys.argv) == 2:\\n day = sys.argv[1]\\n\\n session, year = load_config()\\n problem_input = get_input(session, year, day)\\n\\n with open(f\\"./inputs/day{day:>02}.txt\\", \\"w\\") as f:\\n f.write(problem_input)\\n\\n\\nif __name__ == \\"__main__\\":\\n main()\\n```\\n\\nIf the script is called without any arguments, it will deduce the day from the\\nsystem, so we do not need to change the day every morning. It also requires a\\nconfiguration file:\\n\\n```yaml\\n# env.yaml\\nsession: \u2039your session cookie\u203a\\nyear: 2022\\n```\\n\\n### Libraries\\n\\nLooking at the list of the libraries, I have chosen \u201ca lot\u201d of them. Let\'s walk\\nthrough each of them.\\n\\n[`tracing`] and [`tracing-subscriber`] are the crates that can be used for tracing\\nand logging of your Rust programs, there are also other crates that can help you\\nwith providing backtrace to the Sentry in case you have deployed your application\\nsomewhere and you want to watch over it. In our use case we will just utilize the\\nmacros for debugging in the terminal.\\n\\n[`thiserror`], [`anyhow`] and [`color-eyre`] are used for error reporting.\\n`thiserror` is a very good choice for libraries, cause it extends the `Error`\\nfrom the `std` and allows you to create more convenient error types. Next is\\n`anyhow` which kinda builds on top of the `thiserror` and provides you with simpler\\nerror handling in binaries[^3]. And finally we have `color-eyre` which, as I found\\nout later, is a colorful (_wink wink_) extension of `eyre` which is fork of `anyhow`\\nwhile supporting customized reports.\\n\\nIn the end I have decided to remove `thiserror` and `anyhow`, since first one is\\nsuitable for libraries and the latter was basically fully replaced by `{color-,}eyre`.\\n\\n[`regex`] and [`lazy_static`] are a very good and also, I hope, self-explanatory\\ncombination. `lazy_static` allows you to have static variables that must be initialized\\nduring runtime.\\n\\n[`itertools`] provides some nice extensions to the iterators from the `std`.\\n\\n### My own \u201clibrary\u201d\\n\\nWhen creating the crate for this year\'s Advent of Code, I have chosen a library\\ntype. Even though standard library is huge, some things might not be included and\\nalso we can follow _KISS_. 
I have 2 modules that my \u201clibrary\u201d exports, one for\\nparsing and one for a 2D vector (that gets used quite often during Advent of Code).\\n\\nThe key part is, of course, processing the input, and my library exports the following\\nfunctions that get used a lot:\\n\\n```rust\\n/// Reads file to the string.\\npub fn file_to_string<P: AsRef<Path>>(pathname: P) -> String;\\n\\n/// Reads file and returns it as a vector of characters.\\npub fn file_to_chars<P: AsRef<Path>>(pathname: P) -> Vec<char>;\\n\\n/// Reads file and returns a vector of parsed structures. Expects each structure\\n/// on its own line in the file. And `T` needs to implement `FromStr` trait.\\npub fn file_to_structs<P: AsRef<Path>, T: FromStr>(pathname: P) -> Vec<T>\\nwhere\\n    <T as FromStr>::Err: Debug;\\n\\n/// Converts iterator over strings to a vector of parsed structures. `T` needs\\n/// to implement `FromStr` trait and its error must derive `Debug` trait.\\npub fn strings_to_structs<T: FromStr, U>(\\n    iter: impl Iterator<Item = U>\\n) -> Vec<T>\\nwhere\\n    <T as FromStr>::Err: std::fmt::Debug,\\n    U: Deref<Target = str>;\\n\\n/// Reads file and returns it as a vector of its lines.\\npub fn file_to_lines<P: AsRef<Path>>(pathname: P) -> Vec<String>;\\n```\\n\\nAs for the vector, I went with a rather simple implementation that allows only\\naddition of the vectors for now and accessing the elements via the functions `x()`\\nand `y()`. Also, the vector is generic, so we can use it with any numeric type we\\nneed.\\n\\n### Skeleton\\n\\nWe can also prepare a template to quickly bootstrap each of the days. We know\\nthat each puzzle has 2 parts, which means that we can start with 2 functions that\\nwill solve them.\\n\\n```rust\\nfn part_1(input: &Input) -> Output {\\n    todo!()\\n}\\n\\nfn part_2(input: &Input) -> Output {\\n    todo!()\\n}\\n```\\n\\nBoth functions take a reference to the input and return some output (in the majority\\nof puzzles, it is the same type). `todo!()` can be used as a nice placeholder;\\nit also causes a panic when reached, and we could also provide some string with\\nan explanation, e.g. `todo!(\\"part 1\\")`. We have not given the functions specific\\ntypes, and to avoid as much copy-paste as possible, we will introduce type aliases.\\n\\n```rust\\ntype Input = String;\\ntype Output = i32;\\n```\\n\\n:::tip\\n\\nThis allows us to quickly adjust the types only in one place without the need to\\ndo _regex-replace_ or replace them manually.\\n\\n:::\\n\\nFor each day we get a personalized input that is provided as a text file. Almost\\nall the time, we would like to get some structured type out of that input, and\\ntherefore it makes sense to introduce a new function that will provide the parsing\\nof the input.\\n\\n```rust\\nfn parse_input(path: &str) -> Input {\\n    todo!()\\n}\\n```\\n\\nThis \u201cparser\u201d will take a path to the file, just in case we would like to run the\\nsample instead of the input.\\n\\nOK, so now we can write a `main` function that will take all of the pieces and\\nrun them.\\n\\n```rust\\nfn main() {\\n    let input = parse_input(\\"inputs/dayXX.txt\\");\\n\\n    println!(\\"Part 1: {}\\", part_1(&input));\\n    println!(\\"Part 2: {}\\", part_2(&input));\\n}\\n```\\n\\nThis would definitely do :) But we have installed a few libraries and we want to\\nuse them. In this part we are going to utilize _[`tracing`]_ (for tracing, duh\u2026)\\nand _[`color-eyre`]_ (for better error reporting, e.g. 
from parsing).\\n\\n```rust\\nfn main() -> Result<()> {\\n tracing_subscriber::fmt()\\n .with_env_filter(EnvFilter::from_default_env())\\n .with_target(false)\\n .with_file(true)\\n .with_line_number(true)\\n .without_time()\\n .compact()\\n .init();\\n color_eyre::install()?;\\n\\n let input = parse_input(\\"inputs/dayXX.txt\\");\\n\\n info!(\\"Part 1: {}\\", part_1(&input));\\n info!(\\"Part 2: {}\\", part_2(&input));\\n\\n Ok(())\\n}\\n```\\n\\nThe first statement will set up tracing and configure it to print out the logs to\\nterminal, based on the environment variable. We also change the formatting a bit,\\nsince we do not need all the _fancy_ features of the logger. Pure initialization\\nwould get us logs like this:\\n\\n```\\n2022-12-11T19:53:19.975343Z INFO day01: Part 1: 0\\n```\\n\\nHowever after running that command, we will get the following:\\n\\n```\\n INFO src/bin/day01.rs:35: Part 1: 0\\n```\\n\\nAnd the `color_eyre::install()?` is quite straightforward. We just initialize the\\nerror reporting by _color eyre_.\\n\\n:::caution\\n\\nNotice that we had to add `Ok(())` to the end of the function and adjust the\\nreturn type of the `main` to `Result<()>`. It is caused by the _color eyre_ that\\ncan be installed only once and therefore it can fail, that is how we got the `?`\\nat the end of the `::install` which _unwraps_ the **\xbbresult\xab** of the installation.\\n\\n:::\\n\\nOverall we will get to a template like this:\\n\\n```rust\\nuse aoc_2022::*;\\n\\nuse color_eyre::eyre::Result;\\nuse tracing::info;\\nuse tracing_subscriber::EnvFilter;\\n\\ntype Input = String;\\ntype Output = i32;\\n\\nfn parse_input(path: &str) -> Input {\\n todo!()\\n}\\n\\nfn part1(input: &Input) -> Output {\\n todo!()\\n}\\n\\nfn part2(input: &Input) -> Output {\\n todo!()\\n}\\n\\nfn main() -> Result<()> {\\n tracing_subscriber::fmt()\\n .with_env_filter(EnvFilter::from_default_env())\\n .with_target(false)\\n .with_file(true)\\n .with_line_number(true)\\n .without_time()\\n .compact()\\n .init();\\n color_eyre::install()?;\\n\\n let input = parse_input(\\"inputs/dayXX.txt\\");\\n\\n info!(\\"Part 1: {}\\", part_1(&input));\\n info!(\\"Part 2: {}\\", part_2(&input));\\n\\n Ok(())\\n}\\n```\\n\\n[^1]:\\n Copy-pasting might be a relaxing thing to do, but you can also discover\\n nasty stuff about your PC. 
See [this Reddit post and the comment].\\n\\n[^2]: [GitHub profile](https://github.com/martinjonas)\\n[^3]:\\n Even though you can use it even for libraries, but handling errors from\\n libraries using `anyhow` is nasty\u2026 You will be the stinky one ;)\\n\\n[_advent of code_]: https://adventofcode.com\\n[gitlab]: https://gitlab.com/mfocko/advent-of-code-2022\\n[cargo]: https://doc.rust-lang.org/cargo/\\n[`rust-analyzer`]: https://rust-analyzer.github.io/\\n[`direnv`]: https://direnv.net/\\n[`tracing`]: https://crates.io/crates/tracing\\n[`tracing-subscriber`]: https://crates.io/crates/tracing-subscriber\\n[`thiserror`]: https://crates.io/crates/thiserror\\n[`anyhow`]: https://crates.io/crates/anyhow\\n[`color-eyre`]: https://crates.io/crates/color-eyre\\n[`regex`]: https://crates.io/crates/regex\\n[`lazy_static`]: https://crates.io/crates/lazy_static\\n[`itertools`]: https://crates.io/crates/itertools\\n[this reddit post and the comment]: https://www.reddit.com/r/adventofcode/comments/zb98pn/comment/iyq0ono"}]}')}}]); \ No newline at end of file diff --git a/assets/js/4200b1a9.5219df10.js b/assets/js/4200b1a9.5219df10.js new file mode 100644 index 0000000..7934eb5 --- /dev/null +++ b/assets/js/4200b1a9.5219df10.js @@ -0,0 +1 @@ +"use strict";(self.webpackChunkfi=self.webpackChunkfi||[]).push([[866],{24612:e=>{e.exports=JSON.parse('{"blogPosts":[{"id":"/2024/06/19/devconf-2024","metadata":{"permalink":"/blog/2024/06/19/devconf-2024","editUrl":"https://github.com/mfocko/blog/tree/main/blog/2024-06-19-devconf-2024.md","source":"@site/blog/2024-06-19-devconf-2024.md","title":"DevConf.cz 2024","description":"Sharing my experience on DevConf.cz 2024.\\n","date":"2024-06-19T00:00:00.000Z","formattedDate":"June 19, 2024","tags":[{"label":"\ud83c\udfed","permalink":"/blog/tags/\ud83c\udfed"},{"label":"red-hat","permalink":"/blog/tags/red-hat"},{"label":"fedora","permalink":"/blog/tags/fedora"},{"label":"devconf","permalink":"/blog/tags/devconf"},{"label":"conferences","permalink":"/blog/tags/conferences"}],"readingTime":5.36,"hasTruncateMarker":true,"authors":[{"name":"Matej Focko","email":"me+blog@mfocko.xyz","title":"a.k.a. exhausted DevConf attendee","url":"https://gitlab.com/mfocko","imageURL":"https://github.com/mfocko.png","key":"mf"}],"frontMatter":{"title":"DevConf.cz 2024","description":"Sharing my experience on DevConf.cz 2024.\\n","date":"2024-06-19T00:00:00.000Z","authors":[{"key":"mf","title":"a.k.a. exhausted DevConf attendee"}],"tags":["\ud83c\udfed","red-hat","fedora","devconf","conferences"]},"unlisted":false,"nextItem":{"title":"LTS distributions","permalink":"/blog/2024/02/07/lts-distros"}},"content":"I\'d like to share my experience and views on some of the talks that I\'ve\\nattended on the DevConf.cz 2024.\\n\\n\x3c!--truncate--\x3e\\n\\n## Day 1\\n\\nLet\'s start with the first day which was Thursday this year as opposed to the\\nprevious years when the conference started on Friday and finished on Sunday.\\n\\nLet\'s start with the _[keynote]_. The keynote wasn\'t very interesting, at some of\\nthe slides actually felt like advertisement for other talks on the topic of the\\nAI\u2026\\n\\nNext talk about _[event-driven Ansible]_ was way more interesting. It allows you\\nto run Ansible playbooks after provisioning hosts, or on certain events, such as\\ndiscovered vulnerabilities. On one hand it feels like a very nice thing, but on\\nthe other one I can\'t help but to think how you need to write the playbooks, so\\nthat they are generic enough. 
One more example that\'s been given comes from the\\npossibility to react to tickets, e.g., outages and this feels like something\\nthat could be abused to cause DoS.\\n\\nAfterwards we\'ve seen two lightning talks, one about\\n_[choosing the right OpenShift size]_ which was a pretty quick, but listed all\\nof the possible ways you can deploy OpenShift in detail. This lightning talk\\nwas followed by the first AI (lightning) talk I\'ve attended about\\n_[rapid prototyping]_ of the open-source AI models.\\n\\nAs someone who\'s involved in the automation of the RPM packaging and testing, of\\ncourse, we had to attend _[Learning from Nix]_. Nix has a very intriguing\\nconcept which is pretty powerful, but painful at the same time. This can be\\nsummed up pretty nicely by [Tsoding] who got asked about some tips & tricks for\\nsomeone who wants to try out NixOS:\\n\\n> Just don\'t.\\n\\nAnd now we\'re moving into a section where everything revolves about the Packit\\nTeam :)\\n\\nFirst talk about _[changelogs]_ was an interactive session that was (probably)\\nmeant to share different approaches we take to handle this rather convoluted\\ntopic that involves changelogs on both upstream and also on downstream with no\\nrules[^1].\\n\\n![changelogs](https://i.imgur.com/YHstMAt.jpg)\\n\\nNext one was about _[static analysis]_ done by [OpenScanHub]. I like the idea of\\nrunning the static analysis that can uncover nasty bugs (as it has been even\\nshowed in the talk) at the same time as they are introduced. I gotta admit that\\nafter seeing the UI of the [deployed OpenScanHub] on the Fedora Infra, I couldn\'t\\nhelp but to think about the [United States Graphics Company] :smile: The UI is\\nto the point, no fancy annoying shit, you get what you need, it\'s hard to get\\nlost. **Just simplicity.** Best kind of UI/UX in my opinion.\\n\\nAfter the OpenScanHub talk we\'re getting to talks that were taken in a totally\\ndifferent direction from the usual talks you\'re used to :wink: First one was\\ngiven title of _[\u201cIndiana Jones and obsoleted projects\u201d]_ by [Mirek]. He talked\\nabout projects that got obsoleted, but started with projects that had no\\nrelation to IT field at all. I\'d mark this talk as a _stand up_ without any\\nhesitation.\\n\\nAnd finally we will wrap up the first day with the talk where speakers spoke the\\nleast\u2026 _[\u201cLet the users speak!\u201d]_ that involved users of both Packit and\\nTesting Farm who spoke about their use case and benefits they gained from using\\nboth services in a symbiosis.\\n\\n## Day 2\\n\\nOn the second day I\'ve attended less talks to not burn myself out :) I\'ve\\nstarted with an AI-related talk with title _[\u201cAI: Open source will save us!\u201d]_,\\neven though this talk has been improvised, as the speakers from the schedule\\ncouldn\'t have attended, it provided a nice overview what [InstructLab] can do\\nand how can you \u201cfeed\u201d the relevant info into the language models by yourself.\\n\\nAfter that I attended a _\u201ccoffee enthusiasts Meetup\u201d_ which was very nice and,\\nof course, an organized chaos :wink:\\n\\nBefore attending the social event I wrapped up the second day with a lightning\\ntalk about _[recent updates in Toolbx]_. I\'ve used both [toolbx] and\\n[distrobox], so it\'s nice to see the improvements in progress and also that both\\nprojects are well and lively.\\n\\n## Day 3\\n\\nOn the third day I\'ve attended only in the afternoon. 
\u201cStarted\u201d my day with\\na discussion _[\u201cLeadership: Where people skills meet programmers\u201d]_ which was\\nvery nice for gaining an insight into how developer, manager and QE lead roles\\noverlap.\\n\\nThat talk has been followed up by a talk about [role rotation] in our Packit\\nTeam. I would say it is a nice \u201cupgrade\u201d to the agile process which allows you\\nto not create a single point of failure in the mundane and repetitive processes\\nwithin your team.\\n\\nAnd this day has been finished off with a talk about [shifting left] in Podman.\\nIt\'s nice to see how other teams utilize our Packit Service and also the\\nservices we rely on, such as [Copr] or [Testing Farm]. With the help of Cockpit\\ntests they can catch breaking changes early on, or even bugs that have been\\nintroduced and break usage of the dependent projects.\\n\\n![shifting left](https://i.imgur.com/bp6FxT9.jpg)\\n\\n## Picks from the Packit Team\\n\\nOn the Tuesday, during our Packit stand up, I have managed to abuse my\\nKanban Lead role to collect some of the talks that each of us would recommend:\\n\\n- [Rapid Prototyping] with Open Source AI Models\\n- Do you like your [changelogs]?\\n- OpenScanHub - [Static Analysis] of a Linux Distribution\\n- Creating a [Language Server] for RPM Spec Files\\n- Containers and Kubernetes Made Easy: A 15-minute dive into [Podman Desktop]\\n- [\u201cLeadership: Where people skills meet programmers\u201d]\\n\\n## Wrap up\\n\\nI have to admit that these 3 days have been pretty exhaustive, including\\ninformation overload :smile: but at the same time it was really nice to meet\\nwith the colleagues and at least some of our users who are not based in Brno.\\n\\n[^1]: except for the Fedora\'s downstream ;)\\n\\n[keynote]: https://pretalx.com/devconf-cz-2024/talk/AD3HWR/\\n[event-driven ansible]: https://pretalx.com/devconf-cz-2024/talk/3UKGLB/\\n[choosing the right openshift size]: https://pretalx.com/devconf-cz-2024/talk/KSDRWL/\\n[rapid prototyping]: https://pretalx.com/devconf-cz-2024/talk/H9QFLM/\\n[learning from nix]: https://pretalx.com/devconf-cz-2024/talk/NNKT3F/\\n[tsoding]: https://twitch.tv/tsoding\\n[changelogs]: https://pretalx.com/devconf-cz-2024/talk/ECU7QS/\\n[static analysis]: https://pretalx.com/devconf-cz-2024/talk/7C38GJ/\\n[openscanhub]: https://openscanhub.dev/\\n[deployed openscanhub]: https://openscanhub.fedoraproject.org/\\n[united states graphics company]: https://x.com/usgraphics\\n[\u201cindiana jones and obsoleted projects\u201d]: https://pretalx.com/devconf-cz-2024/talk/X8SYDG/\\n[mirek]: https://rodina-sucha.cz/@mirek\\n[\u201clet the users speak!\u201d]: https://pretalx.com/devconf-cz-2024/talk/BDMWF3/\\n[\u201cai: open source will save us!\u201d]: https://pretalx.com/devconf-cz-2024/talk/QSF9QQ/\\n[instructlab]: https://github.com/instructlab/instructlab\\n[recent updates in toolbx]: https://pretalx.com/devconf-cz-2024/talk/SXWE7K/\\n[toolbx]: https://containertoolbx.org/\\n[distrobox]: https://distrobox.it/\\n[\u201cleadership: where people skills meet programmers\u201d]: https://pretalx.com/devconf-cz-2024/talk/8PARM8/\\n[role rotation]: https://pretalx.com/devconf-cz-2024/talk/8T88MT/\\n[shifting left]: https://pretalx.com/devconf-cz-2024/talk/WVNJZS/\\n[copr]: https://copr.fedorainfracloud.org/\\n[testing farm]: https://docs.testing-farm.io/Testing%20Farm/0.1/index.html\\n[language server]: https://pretalx.com/devconf-cz-2024/talk/RXKMKA/\\n[podman desktop]: 
https://pretalx.com/devconf-cz-2024/talk/HKWP7V/"},{"id":"/2024/02/07/lts-distros","metadata":{"permalink":"/blog/2024/02/07/lts-distros","editUrl":"https://github.com/mfocko/blog/tree/main/blog/2024-02-07-lts-distros.md","source":"@site/blog/2024-02-07-lts-distros.md","title":"LTS distributions","description":"Shower thoughts on the LTS Linux distributions.\\n","date":"2024-02-07T00:00:00.000Z","formattedDate":"February 7, 2024","tags":[{"label":"lts","permalink":"/blog/tags/lts"},{"label":"linux distributions","permalink":"/blog/tags/linux-distributions"},{"label":"support","permalink":"/blog/tags/support"},{"label":"paywall","permalink":"/blog/tags/paywall"}],"readingTime":14.515,"hasTruncateMarker":true,"authors":[{"name":"Matej Focko","email":"me+blog@mfocko.xyz","title":"a.k.a. small Fedora maintainer","url":"https://gitlab.com/mfocko","imageURL":"https://github.com/mfocko.png","key":"mf"}],"frontMatter":{"title":"LTS distributions","description":"Shower thoughts on the LTS Linux distributions.\\n","date":"2024-02-07T00:00:00.000Z","authors":[{"key":"mf","title":"a.k.a. small Fedora maintainer"}],"tags":["lts","linux distributions","support","paywall"],"hide_table_of_contents":false},"unlisted":false,"prevItem":{"title":"DevConf.cz 2024","permalink":"/blog/2024/06/19/devconf-2024"},"nextItem":{"title":"Mixed feelings on Rust","permalink":"/blog/2024/01/28/rust-opinion"}},"content":"Linux distributions are a common choice for running the servers. There\'s a wide\\nvariety of distributions, but on the servers majority is made by only a few.\\n\\nSome corporations also profit from the support of the \u201cbig\u201d distributions. Let\'s\\ndive into the pros, cons and peculiarities of such _business_.\\n\\nThis post is inspired/triggered by the following Mastodon post:\\n[![Mastodon post about Ubuntu Pro](https://i.imgur.com/mh5RAlV.png)](https://hackers.town/@antijingoist/111864760073049505)\\n\\n\x3c!--truncate--\x3e\\n\\n:::caution Disclaimer\\n\\nYou may take my opinion with a grain of salt, since I\'m affiliated with Red Hat,\\nbut at the same time I\'ve also seen the other side of the fence, so I know how\\nit works from the perspective of the provider/maintainer.\\n\\n:::\\n\\n:::tip\\n\\nIf you are not very oriented in the matters of Linux distributions and\\nmaintaining of packages, I suggest looking at the [glossary](#glossary) at the\\nend to have a better grasp of the terms that are used throughout the post.\\n\\n:::\\n\\n## Point of linux distributions\\n\\nFirst thing I\'d like to point out is the point of the Linux distributions. What\\nbenefit do they provide? And why there are so many of them\u2026\\n\\nAs it has been brought up many times by the _rms_[^1], Linux by itself is not\\nenough, it\'s just the kernel that does the underlying work. We need more\\nsoftware to utilize the hardware. That\'s the gap that Linux distributions bridge\\nby providing the Linux and much more other software that we need.\\n\\nEach distribution is unique in its own way. Some prefer different ways of\\nhandling the software (like Gentoo that allows you to compile it yourself) and\\nothers stable releases of software (like Debian).\\n\\nIn the end it mostly boils down to the packaging. I, as a user, want to do\\nsomething like\\n\\n```\\n$ sudo dnf5 install firefox\\n```\\n\\nand not bother about anything else. I don\'t want to open browser to look the\\nthing up, download it and then click mindlessly 500\xd7 \u201cNext\u201d. 
I just want to run\\none command and when the maintainers decide it\'s time to move on, another one to\\nupgrade the software to the newer version.\\n\\nOf course, for some use cases you want to minimize the latter. And even make\\nsure that it\'s safe to do it when you need to. You don\'t want to break your\\nproduction deployment just because someone decided it\'s time to push something\\nout.\\n\\nThat\'s when the _maintainers_ come in. They take upon themselves the\\nresponsibility of maintaining the packages. If you\'ve ever used the Debian, you\\nknow very well how _old_ the software is, but that\'s what you might need for\\nyour servers.\\n\\n## Pain of packaging\\n\\nPackaging software _is not_ cost-free. You may as well have 80 % of packages\\nthat don\'t need much care and it\'s rather easy to push them forward, but those\\nremaining, which are complicated and raise issues regularly, will make it up and\\ntake a lot of time and also pain.\\n\\nLibraries are the most common example that might not need much work to be done.\\nOn the other hand, Linux kernel itself is a rather complicated machinery that\\nis patched **a lot** and its build process is not simple either.\\n\\nEven if you consider just those _easily-maintainble_ packages, the process can\\nbe tedious, boring and overall time consuming.\\n\\n:::tip Shameless RHEL-based ecosystem plug\\n\\n[Packit] can help tremendously with the _easily-maintainable_ packages, since it\\n**can** be automated.\\n\\n:::\\n\\n### Packaging whole ecosystems\\n\\nNow it\'s time to talk about whole ecosystems that have some kind of a packaging\\nby themselves. Yes, I mean Python (with its continuous stream of different\\npackage managers), Rust, Go, etc.\\n\\nWhole point of packaging is to have some form of _gating_. In other words, you\\nwant some kind of _quality control_ when pushing changes into the Linux distros.\\n\\nIf you want to package some tool (or even library) from the aforementioned\\necosystems, you need to package all of the dependencies to make sure something\\ndoesn\'t get updated in the meantime (and also that you can safely reproduce the\\nbuilds, if need be).\\n\\nI\'ve tried to package some utilities for EPEL both in Rust and Go. Dependencies\\nform a DAG[^2] and in case of Rust, it\'s _very_ similar to the way `npm` does\\nits packaging.\\n\\n:::danger Spoiler alert\\n\\nYou get a lot of dependencies. And since it\'s a tree of dependencies, there may\\nbe **a lot** of them.\\n\\n:::\\n\\nI have no clue how do the Rust maintainers operate, but I\'m tipping my fedora in\\ntheir direction, since it must be a _pain in the ass_.\\n\\n## Paid distributions\\n\\nYou can find few Linux distributions that are \u201cpaid\u201d. I\'m very well aware of the\\nfact I\'ve used quotes around the word, cause it\'s not that easy and not even\\nsame for all of the distributions that involve some kind of a payment.\\n\\nOne of the first non-free distributions I\'ve come into contact was _[Zorin OS]_\\nwhich basically tries to be the best _transition_ solution when moving away from\\nthe Windows or macOS. If you have a look at the _perks_ of its _Pro_ version\\nthat\'s paid, you may as well decide they are rather questionable\u2026\\n\\nIt\'s time to move into the _Ubuntu Pro_, _RHEL_ and _SLE_ territory. What\'s the\\npoint of those? 
They definitely offer different kind of, let\'s say,\\n_non-free experience_.\\n\\nWith those you are paying mainly for the support and bug/security patches.\\n\\n:::tip Fun fact\\n\\nThere\'s no mention of any kind of support on the Zorin page\u2026 Apart from the fact\\nthat _you are supporting_ the Zorin development.\\n\\n:::\\n\\n## Repository structure\\n\\nAs I have mentioned above, the three _services_[^3] I mentioned are providing\\nsupport with regards to bugs and security vulnerabilites. Therefore it makes\\nsense to have some kind of a process in place when you\'re pushing changes\\n(either updates, patches or _security_ patches) to the distribution. And yes,\\nthese processes are _in place_.\\n\\nIf you think about the amount of packages that is present in the community\\ndistributions like _archLinux_ (14,830 packages) or _Fedora_ (74,309 packages),\\nit is safe to come to a conclusion that _there\'s no way_ to support all of them.\\n\\n:::tip archLinux\\n\\nIt may seem that archLinux contains rather small set of packages, but one of the\\n_killer features_ of archLinux lies in the AUR (archLinux User Repository) where\\nyou can find additional **93,283** packages.\\n\\n:::\\n\\nThat\'s why the Linux distributions have some structure to their repositories\\nthat contain packages. The way you go around this is rather simple, you choose\\nsome set of _critical_ packages that you guarantee support for (like Linux\\nkernel, openSSL, etc.) and maintain those with all the QA processes in place.\\n\\n:::caution Unpopular opinion\\n\\nThis is also one of the reasons why I\'m quite against packaging anything and\\neverything into the Linux distribution. In my opinion it is impossible to\\n**properly** maintain **huge** set of packages and enforce some kind of\\n**quality control**.\\n\\n:::\\n\\n### Ubuntu\\n\\nUbuntu has pretty granular structure of their repositories, namely:\\n\\n- `main` containing the \u201ccore\u201d of the Ubuntu that is maintained by the Canonical,\\n- `universe` containing literally the \u201cuniverse\u201d, packages that everyone likes,\\n but they\'re not crucial, this repo is maintained mostly by the community,\\n- `multiverse` containing packages with some license or copyright issues, and\\n- `restricted` containing _proprietary_ packages like nvidia drivers and such.\\n\\nBy briefly checking my Ubuntu 23.10 installation, here are stats of packages in\\ntheir respective repositories:\\n\\n- `main` with 6,128 packages,\\n- `universe` with 63,380 packages,\\n- `multiverse` with 997 packages, and finally\\n- `restricted` with 784 packages.\\n\\nAs you can see, if we sum them up, they are relatively similar to the Fedora\\nnumbers.\\n\\n### CentOS\\n\\nCentOS on the other hand has a bit simpler structure with BaseOS for the base\\nand AppStream for additional packages:\\n\\n- `baseos` with 1,058 packages,\\n- `appstream` with 5,646 packages, and\\n- `extras-common` with 42 packages.\\n\\nOverall they make up the similar number as the Ubuntu\'s `main` repository. And\\nyou can also notice that there are no additional repositories.\\n\\n:::tip\\n\\nThere\'s also a CRB (CodeReady Builder) repository with dev packages like headers\\nand such.\\n\\nAnd you can also enable EPEL (Extra Packages for Enterprise Linux) which is\\ncommunity-supported and provides another 19,903 packages.\\n\\n:::\\n\\n## Ubuntu Pro\\n\\nNow it\'s time to get back to the Ubuntu Pro. 
There are multiple points that need\\nto be taken in account to be either positive or negative about it\u2026\\n\\nWe can start with the way Ubuntu is released and maintained. Ubuntu has regular\\n6-month release cycle and biannual LTS release. Releases are normally supported\\nfor 9 months with the exception of the LTS releases being supported for 5 years.\\n\\nIf you check out the _[Ubuntu Pro]_ website, you can find the following\\nstatement:\\n\\n> **Ubuntu Pro**\\n>\\n> The most comprehensive subscription for open-source software security\\n>\\n> 30-day trial for enterprises. Always free for personal use.\\n\\n:::tip Personal use\\n\\nUbuntu Pro for _personal use_ consists of 5 installations and in case of the\\ncommunity _ambassadors_ 50.\\n\\n:::\\n\\nOverall if you try to find what is included in the Ubuntu Pro:\\n\\n- high and critical patches,\\n- 10 years of maintenance, and\\n- (optional) 24/7 enterprise-grade support.\\n\\nIf we get back to the screenshot all the way at the beginning of the post:\\n[![Mastodon post about Ubuntu Pro](https://i.imgur.com/mh5RAlV.png)](https://hackers.town/@antijingoist/111864760073049505)\\n\\nand try to look up to which repository the packages mentioned in the screenshot\\nbelong, we will find out that they belong to `universe` repository which is\\nmaintained by the community. Not to mention nature of the packages: multimedia.\\n\\nYou may think about this as a scam, but considering repository consisting of 70k\\npackages, it is not an easy task to do. And with LTS releases we\'re talking\\nabout 5+ years of support.\\n\\n:::info Fedora\\n\\nTry to compare this state to Fedora. It also has a 6-month release cycle, but\\nthere are no LTS releases and each release is supported only for a year.\\n\\n:::\\n\\nCommon strategy, at this point, is to pull out the _open-source_. Yes, we are\\nstill dealing with the open-source, but keep in mind that you\'re trying to patch\\nsome issue in a version that\'s 5 years old, upstream definitely doesn\'t care\\nanymore[^4], the development didn\'t stop 5 years ago, it\'s going on and fixing\\nthis issue in a release from 5 years is not the same as fixing it in the current\\nrelease. At this point, if you are paying for such support, you are actually\\npaying for someone to do _software archaeology_ which **can be** _non-trivial_\\nto do.\\n\\nIn the case of Ubuntu Pro we\'re talking about community support and best-effort\\nsupport by Canonical for the paying customers. And that makes sense to me,\\nrunning LTS distro for 5+ years on a desktop seems like an odd choice, even\\nwith the help of _[podman]_ and _[distrobox]_ or _[toolbx]_ that allow us to use\\nstable or LTS distro as a base and containerized development environments on top\\nof that.\\n\\n## RHEL ecosystem\\n\\nRHEL ecosystem is much more complicated in this matter. 
However it\'s very\\nsimilar to the way SUSE operates with few exceptions.\\n\\nYou can see a flow diagram here:\\n\\n```mermaid\\nflowchart LR;\\n U[upstream] --\x3e FR[Fedora Rawhide];\\n FR --\x3e F[Fedora release];\\n F --\x3e C[CentOS Stream];\\n C --\x3e R[RHEL];\\n```\\n\\nKey things to take and not to take from the flow diagram:\\n\\n- getting from one upstream to its respective downstream is not as simple as the\\n presence of an arrow and it\'s not the same process for all of them\\n- lengths of the arrows are not proportional, specifically:\\n - Fedora Rawhide is _supposed to_ consume updates as soon as possible,\\n - depending on the decision of the maintainer they can, but _don\'t have to_ be\\n included in the currently supported Fedora releases (you can take [Emacs] as\\n an example of such package), but Rawhide eventually becomes the next Fedora\\n release,\\n - CentOS Stream gets branched off a specific Fedora release, and then\\n - ultimately CentOS Stream becomes the next **minor** release of RHEL.\\n- this diagram is simplified by **a lot**\\n\\n:::tip SUSE flow for comparison\\n\\nI\'ll also include a SUSE flow, so you can compare:\\n\\n```mermaid\\nflowchart LR;\\n U[upstream] --\x3e T[openSUSE Tumbleweed];\\n T --\x3e L[openSUSE Leap];\\n L --\x3e S[SUSE Linux Enterprise];\\n S --\x3e L;\\n```\\n\\nYou can notice, as opposed to the RHEL ecosystem, some changes are being\\nbackported to the openSUSE Leap.\\n\\nHowever this is subject to change as there is a new [ALP] project arising which\\nis, more than likely, going to replace the Leap.\\n\\n:::\\n\\n### Change in the model\\n\\nThe flow I\'ve shown above is in effect since late \u201820 and early \u201821. I hope you\\ncan see that it is quite similar to the way SUSE operates too. Before late \u201820\\nthe flow was following:\\n\\n```mermaid\\nflowchart LR;\\n U[upstream] --\x3e FR[Fedora Rawhide];\\n FR --\x3e F[Fedora release];\\n F --\x3e R[RHEL];\\n R --- C[CentOS];\\n```\\n\\nCentOS was the last distribution in that \u201cchain\u201d. This provides some benefits\\nand some negatives.\\n\\n#### Before the change\\n\\nFrom the point of a developer, unless you have some kind of an early access to\\nRHEL, you don\'t see the changes until they land and are already released. This\\nimpairs your ability to test and verify your software before shipping it to your\\nclients that use RHEL.\\n\\nFrom the point of a user, there is one positive, you basically get \u201cfree RHEL\u201d\\nwithout the support. This also allowed you to report bugs against the RHEL,\\nsince they were 1:1 distros (minus the branding and support). So you\'d\\ntechnically get RHEL free of charge.\\n\\nBenefit of such project, except for the cost, is questionable. The main issue,\\nwhich actually became even more apparent after changing the flow, is someone\\nelse repackaging your own product and selling it again.\\n\\n#### After the change\\n\\nFirst of all, the current flow counters the issue mentioned above. You can test\\nyour projects against the _next minor RHEL release_. CentOS Stream is free, so\\nyou can freely incorporate it into your CI pipelines.\\n\\n:::tip Shameless plug pt. 
2\\n\\nAgain, [Packit] can help you on upstream to verify that you\'re not breaking your\\nRPM builds and on top of that you can also use [Testing Farm] to run tests on a\\nspecific Fedora or CentOS Stream releases.\\n\\n> Green tests may not be green everywhere and catching such issues as soon as\\n> possible costs much less than catching them further down the chain.\\n\\n:::\\n\\nThere are many people thinking that RHEL has become closed-source. It is not.\\nThe development happens _out in the open_, it\'s more open that it was before.\\nHowever with the cost of not getting the exact same thing for free. You can get\\nthe next minor RHEL, not the same that\'s normally paid for. [Packit] is an\\nexample of a service that is deployed on the CentOS 9 Stream and even used to be\\ndeployed on Fedora, but the regular 6-month release cycle caused some minor\\nissues here and there.\\n\\n_Production-ready_ is something that heavily depends on the context\u2026\\n\\n:::tip Free \u201cclones\u201d\\n\\nAfter this change so-called _free \u201cclones\u201d_ emerged. I have to admit that in\\ncase of _[AlmaLinux]_ I can see some benefits e.g., pushing for live images and\\nsupport of various desktop environments, Raspberry Pi support or even WSL images\\nbeing present in the M$ Store and easy to install.\\n\\n:::\\n\\n## Open-source and paid support\\n\\nOverall I don\'t think that paying for the support of 5 years old _non-critical_\\npackages is going against the open-source. It is a non-trivial work that, in\\nmajority of cases, cannot be included in the upstream, therefore the benefit is\\nreapt only by the paying customers. I have to admit that in the case of the\\nUbuntu Pro it may seem a bit weird (hiding patches behind the paywall). However\\nwe\'re still talking about rather big set of packages that will affect a minority\\nof server workloads, if any.\\n\\n## Glossary\\n\\n- _rolling release_ - continuously released without \u201csignificant milestones\u201d\\n\\n :::tip\\n\\n As an example of rolling distribution you can take archLinux, openSUSE\\n Tumbleweed, Fedora Rawhide, or even CentOS 9 Stream.\\n\\n As en example of **not** rolling distribution you can take Ubuntu, openSUSE\\n Leap or Fedora.\\n\\n :::\\n\\n- _bleeding edge_ - contains the latest versions as they are released on the\\n upstream\\n\\n :::tip\\n\\n As an example you can take archLinux, openSUSE Tumbleweed or Fedora Rawhide.\\n You can also notice how common it is to combine _rolling release_ with\\n _bleeding edge_.\\n\\n :::\\n\\n- _upstream_ & _downstream_\\n\\n You\'re most likely to meet these terms in the meaning of upstream being the\\n project itself and downstream being the packaging of said project in some\\n distribution.\\n\\n However this can also apply to distributions like _openSUSE Tumbleweed_ with\\n _openSUSE Leap_, _Fedora_ with _CentOS Stream_, or even _CentOS Stream_ with\\n _RHEL_. 
This basically means that the packages/software is being released into\\n the upstream (Tumbleweed, Fedora, or even CentOS) and then after being tested\\n is taken further down into their respective downstreams (Leap, CentOS, RHEL).\\n\\n[almalinux]: https://almalinux.org/\\n[alp]: https://susealp.io/\\n[distrobox]: https://distrobox.it/\\n[emacs]: https://src.fedoraproject.org/rpms/emacs/\\n[packit]: https://packit.dev/\\n[podman]: https://podman.io/\\n[testing farm]: https://docs.testing-farm.io/Testing%20Farm/0.1/index.html\\n[toolbx]: https://containertoolbx.org/\\n[ubuntu pro]: https://ubuntu.com/pro/\\n[zorin os]: https://zorin.com/os/pro/\\n\\n[^1]: Richard Stallman\\n[^2]: directed acyclic graph\\n[^3]:\\n Ubuntu Pro is technically a service whereas the RHEL and SLE are distros\\n with the support included.\\n\\n[^4]:\\n There are upstream projects that keep LTS branches, such as Linux kernel,\\n but even in the case of the kernel itself, they\'re planning on ending it,\\n since the cost outweighs the benefits at this point."},{"id":"/2024/01/28/rust-opinion","metadata":{"permalink":"/blog/2024/01/28/rust-opinion","editUrl":"https://github.com/mfocko/blog/tree/main/blog/2024-01-28-rust-opinion.md","source":"@site/blog/2024-01-28-rust-opinion.md","title":"Mixed feelings on Rust","description":"Discussing my mixed feelings about the Rust language.\\n","date":"2024-01-28T00:00:00.000Z","formattedDate":"January 28, 2024","tags":[{"label":"rust","permalink":"/blog/tags/rust"},{"label":"memory safety","permalink":"/blog/tags/memory-safety"},{"label":"cult","permalink":"/blog/tags/cult"},{"label":"hype","permalink":"/blog/tags/hype"}],"readingTime":15.395,"hasTruncateMarker":true,"authors":[{"name":"Matej Focko","email":"me+blog@mfocko.xyz","title":"a.k.a. passionate language hater","url":"https://gitlab.com/mfocko","imageURL":"https://github.com/mfocko.png","key":"mf"}],"frontMatter":{"title":"Mixed feelings on Rust","description":"Discussing my mixed feelings about the Rust language.\\n","date":"2024-01-28T00:00:00.000Z","authors":[{"key":"mf","title":"a.k.a. passionate language hater"}],"tags":["rust","memory safety","cult","hype"],"hide_table_of_contents":false},"unlisted":false,"prevItem":{"title":"LTS distributions","permalink":"/blog/2024/02/07/lts-distros"},"nextItem":{"title":"How can Copr help with broken dependencies","permalink":"/blog/2023/08/02/copr"}},"content":"Rust has become a rather popular language these days. I\'ve managed to get my\\nhands dirty with it during _[Advent of Code]_ \u201822 and partially \u201823. I\'ve also\\nused it for few rounds of _[Codeforces]_ and I have to try very hard to maintain\\nsome variety of languages for LeetCode challenges along with the Rust. I\'ll\\ndisclaim up front that I won\'t be only positive, since this post is a result of\\nmultiple discussions about Rust and I stand by\\n_\u201cAll that glitters is not gold\u201d_, so if you can\'t stand your favorite language\\nbeing criticized in any way, don\'t even proceed. :wink:\\n\\n\x3c!--truncate--\x3e\\n\\n## Memory safety\\n\\nI\'ll start by kicking the biggest benefit of the language, the memory safety.\\nLet\'s be honest here, majority of the checks rely on the static analysis, cause\\nyou can\'t do anything else during the compile-time, right? 
Therefore we can\\nbasically say that we are relying on the compiler to \u201csolve\u201d all of our issues.\\n\\n:::warning\\n\\nI\'m not doubting the fact that compiler can prevent **a lot** of the memory\\nerrors, I\'m just saying it\'s not realistic to cover **everything**.\\n\\n:::\\n\\n### Compiler\\n\\nI guess we can safely[^2] agree on the fact that we 100% rely on the compiler to\\n_have our back_. Is the compiler bug-free? I doubt it. This is not meant in an\\noffensive way to the Rust compiler developers, but we need to be realistic here.\\nIt\'s a compiler, even older and larger projects like _gcc_ or _llvm_ can\'t avoid\\nbugs to appear.\\n\\nWhen I was trying out Rust for some of the LeetCode challenges I\'ve stumbled\\nupon the following warning:\\n![Example of a compiler bug](https://i.imgur.com/NfPLF6o.png)\\n\\n:::danger [Issue](https://github.com/rust-lang/rust/issues/59159)\\n\\nThe issue here comes from the fact that we have 2 simultaneous references to the\\nsame memory (one is mutable and one immutable). If you cannot think of any way\\nthis can break, I\'ll give you a rather simple example from C++ where this could\\ncause an issue.\\n\\nImagine a function that has some complex object and also calls a coroutine which\\nutilizes read-only reference to that object. When the coroutine suspends, the\\ncaller can modify the object. This can break the integrity of data read by the\\ncoroutine.\\n\\n- Yes, this **can** cause a memory error.\\n- Yes, this **hasn\'t** been handled until someone noticed it.\\n\\nFixing this bug is not backwards compatible, cause you\'re covering a case that\\nhasn\'t been covered before.\\n\\n:::\\n\\n### Enforcing the safety\\n\\nOne of the ways Rust enforces the safety is by restricting what you can do, like\\nthe example above. Aforementioned issue _can_ happen, but **doesn\'t have to**.\\nRule of the thumb in the Rust compiler is to _\u201cblock\u201d_ anything that can be an\\nissue, static analysis can\'t do much more, it cannot decide whether it\'s safe to\\ndo it or not.\\n\\nSatisfying the Rust compiler is sometimes a brutal pain in the ass, because you\\ncannot do things like you\'re used to, you need to work around them _somehow_.\\n\\n:::tip\\n\\nKey difference between Rust and C or C++ lies in the fact that Rust chooses to\\n_ban_ all \u201cpotentially offensive\u201d actions, C and C++ _relies_ on **you** to be\\nsure it\'s safe to do.\\n\\n![C++ v. Rust](https://i.imgur.com/0vbkYPp.png)\\n\\n:::\\n\\n### Consequences\\n\\nWhere are we heading with this approach of \u201cif it compiles, it runs\u201d though?\\nIn this aspect I have a rather similar opinion as with regards to the ChatGPT\\nand its derivatives.\\n\\nIf you teach people to 100% depend on the compiler, they will do it, cause it\'s\\n_easy_. All you need to do is make the compiler _shut up_[^3]. Giving up the\\n_intellectual masturbation_ about the memory safety will make you lose your edge\\nover the time. When we get to the point of everyone being in the mindset\\nmentioned above, who\'s going to maintain the compiler? This is the place where\\nyou **need to** think about the memory safety and furthermore in a much more\\ngeneral way than in your own projects, because it is the thing that everyone\\n_blindly believes in_ in the end.\\n\\nI\'m not saying that everyone should give up Rust and think about their memory\\nmanagement and potential memory issues. 
I\'m just saying that going the easy way\\nwill make people _dull_ and they should think about it anyways, that\'s how the\\nissue above has been discovered. If everyone walked past and didn\'t think about\\nit, no one would discover this issue till it bit them hard.\\n\\n:::tip Standard library\\n\\nEven the standard library is littered with `unsafe` blocks that are prefixed\\nwith comments in style:\\n\\n```rs\\n// SAFETY: \u2026\\n```\\n\\nThe fact that the _casual_ Rust dev doesn\'t have to think much about safety,\\ncause the compiler has their back, doesn\'t mean that the Rust compiler dev\\ndoesn\'t either.\\n\\nI gotta admit that I adopted this concept in other languages (even in Python),\\ncause you can encounter situations where it doesn\'t have to be clear _why_ you\\ncan do _what_ you\'re doing.\\n\\n:::\\n\\n## Development & design\\n\\nDevelopment of Rust is\u2026 very fast. One positive is that they\'re trying to be as\\nbackward compatible as possible at least by verifying against all the published\\ncrates in the process. Of course, you cannot be backward compatible about fixing\\nthe bugs that have been found, but such is life.\\n\\n### Fast development cycle\\n\\nOne of the negatives of the fast development cycle is the fact that they\'re\\nusing the latest features already in the next release of the Rust. Yes, it is\\nsomething that you can use for verifying and testing your own changes, but at\\nthe same time it places a requirement of the latest release to compile the next\\none.\\n\\n:::tip\\n\\nIf you check `gcc` for example, they have a requirement of minimal version of\\ncompiler that you need for the build. Though gcc\'s requirement is not so _needy_\\nas the Rust one.\\n\\n:::\\n\\nOne of the other negatives is the introduction of bugs. If you\'re pushing\\nchanges, somewhat mindlessly, at such a fast pace, it is inevitable to introduce\\na bunch bugs in the process. Checking the GitHub issue tracker with\\n\\n```\\nis:issue is:open label:C-bug label:T-compiler\\n```\\n\\nyields **2,224** open issues at the time of writing this post.\\n\\n### RFCs\\n\\nYou can find **a lot** of RFCs for the Rust. Some of them are more questionable\\nthan the others. Fun thing is that a lot of them make it to the nightly builds,\\nso they can be tested and polished off. Even the questionable ones\u2026 I\'ll leave\\nfew examples for a better understanding.\\n\\nOne of such features is the `do yeet` expression:\\n\\n```rust\\n#![feature(yeet_expr)]\\n\\nfn foo() -> Result {\\n do yeet 4;\\n}\\nassert_eq!(foo(), Err(4));\\n\\nfn bar() -> Option {\\n do yeet;\\n}\\nassert_eq!(bar(), None);\\n```\\n\\nIt allows you to \u201cyeet\u201d the errors out of the functions that return `Result` or\\n`Option`.\\n\\n[One](https://github.com/rust-lang/rfcs/pull/3503) of the more recent ones is\\nthe ability to include Cargo manifests into the sources, so you can do something\\nlike:\\n\\n```rust\\n#!/usr/bin/env cargo\\n---\\n[dependencies]\\nclap = { version = \\"4.2\\", features = [\\"derive\\"] }\\n---\\n\\nuse clap::Parser;\\n\\n#[derive(Parser, Debug)]\\n#[clap(version)]\\nstruct Args {\\n #[clap(short, long, help = \\"Path to config\\")]\\n config: Option,\\n}\\n\\nfn main() {\\n let args = Args::parse();\\n println!(\\"{:?}\\", args);\\n}\\n```\\n\\nI would say you can get almost anything into the language\u2026\\n\\n## Community and hype train\\n\\nRust community is a rather unique thing. 
A lot of people will hate me for this,\\nbut I can\'t help, but to compare them to _militant vegans_. I\'ll go through some\\nof the things related to it, so I can support my opinion at least.\\n\\n_Rust is the best language._ It is not. There is no best language, each has its\\nown positives and negatives, you need to choose the language that\'s **the most**\\n**suitable for your use case**. There are areas where Rust excels, though I have\\nto admit it\'s very close to being a universal hammer regardless of how suitable\\nit is. There is a very steep learning curve to it, beginnings in Rust are very\\npainful.\\n\\n_Rewrite everything in Rust._ Just no. There are multiple feedbacks on doing\\nrewrites, it is very common to fix _N_ bugs with a rewrite while introducing\\n_N + 1_ other bugs in the process. It doesn\'t solve anything unless there are\\nsome strong reasons to go with it. Majority of such suggested rewrites don\'t\\nhave those reasons though.\\n\\n_Language \u2039x\u203a is bad, though in Rust\u2026_ Cherry-picking one specific pain point of\\none language and reflecting how it is better in other language can go both ways.\\nFor example it is rather easy to pick the limitations imposed by Rust compiler\\nand show how it\'s possible in other languages :man_shrugging:\\n\\nI don\'t mind any of those opinions, you\'re free to have them, as long as you\\ndon\'t rub them in my face which is not the usual case\u2026 This experience makes it\\njust worse for me, part of this post may be also influenced by this fact.\\n\\n### Rust in Linux\\n\\n:::caution\\n\\nAs someone who has seen the way Linux kernel is built in the RHEL ecosystem, how\\ncomplex the whole thing is and how much resources you need to proceed, I have\\nvery strong opinions on this topic.\\n\\n:::\\n\\nIt took years of work to even \u201cincorporate\u201d Rust into the Linux codebase, just\\nto get the \u201cHello World!\u201d. I don\'t have anything against the idea of writing\\ndrivers in the Rust, I bet it can catch a lot of common mistakes, but still\\nintroducing Rust to the kernel is another step to enlarge the monster.\\n\\nI have to admit though that the _Apple GPU_ driver for Linux written in Rust is\\nquite impressive. Apart from that there are not so many benefits, yet\u2026\\n\\n## Packaging\\n\\nI\'ll divide the packaging into the packaging of the language itself and the\\nprograms written in Rust.\\n\\nLet\'s start with the `cargo` itself though. Package managers of the languages\\nusually get a lot of hate (you can take `npm` or `pip` as examples[^1]). If\\nyou\'ve ever tried out Rust, I bet you already know where I\'m going with this.\\nYes, I mean the compilation times, or even Cargo downloading _whole_ index of\\ncrates just so you can update that one dependency (and 3 millions of indirect\\ndeps). When I was doing AoC \u201822 in Rust, I\'ve set up `sccache` right away on the\\nfirst day.\\n\\nLet\'s move to the packaging of the Rust itself, it\'s tedious. Rust has a very\\nfast development cycle and doesn\'t even try to make the builds backward\\ncompatible. If there is a new release of Rust, there is a very high chance that\\nyou cannot build that release with anything other than **the latest** Rust\\nrelease. 
If you have ever touched the packaging, you know that this is something\\nthat can cause a lot of problems, cause you need the second-to-latest version to\\ncompile the latest version, don\'t forget that this applies inductively\u2026 People\\nrunning _Gentoo_ could tell you a lot about this.\\n\\n:::info\\n\\nCompiling the compilers takes usually more time than compiling the kernel\\nitself\u2026\\n\\n:::\\n\\nI cannot speak about packaging of Rust programs in other than RHEL-based\\ndistros, though I can speak about RHEL ecosystem. Fedora packaging guidelines\\nspecify that you need to build each and every dependency of the program\\nseparately. I wanted to try out _AlmaLinux_ and install Alacritty there and I\\nfailed miserably. The solution that worked, consisted of ignoring the packaging\\nguidelines, running `cargo build` and consuming the binaries afterwards.\\nDependencies of the Rust programs are of a similar nature as JS dependencies.\\n\\n> I\'m tipping my fedora[^2] in the general direction of the maintainers of Rust\\n> packages in RHEL ecosystem. I wouldn\'t be able to do this without losing my\\n> sanity.\\n\\n## Likes\\n\\nIf you\'ve come all the way here and you\'re a Rustacean, I believe I\'ve managed\\nto get your blood boiling, so it\'s time to finish this off by stuff I like about\\nRust. I doubt I will be able to cover everything, but I can try at least. You\\nhave to admit it\'s much easier to remember the bad stuff as opposed to the good.\\n:wink:\\n\\n### Workflow and toolchain\\n\\nI prefered using Rust for the _Advent of Code_ and _Codeforces_ as it provides\\na rather easy way to test the solutions before running them with the challenge\\ninput (or test runner). I can give an example from the _Advent of Code_:\\n\\n```rust\\nuse aoc_2023::*;\\n\\ntype Output1 = i32;\\ntype Output2 = Output1;\\n\\nstruct DayXX {}\\nimpl Solution for DayXX {\\n fn new>(pathname: P) -> Self {\\n let lines: Vec = file_to_lines(pathname);\\n\\n todo!()\\n }\\n\\n fn part_1(&mut self) -> Output1 {\\n todo!()\\n }\\n\\n fn part_2(&mut self) -> Output2 {\\n todo!()\\n }\\n}\\n\\nfn main() -> Result<()> {\\n DayXX::main()\\n}\\n\\ntest_sample!(day_XX, DayXX, 42, 69);\\n```\\n\\nThis was the skeleton I\'ve used and the macro at the end is my own creation that\\nexpands to:\\n\\n```rust\\n#[cfg(test)]\\nmod day_XX {\\n use super::*;\\n\\n #[test]\\n fn part_1() {\\n let path = DayXX::get_sample(1);\\n let mut day = DayXX::new(path);\\n assert_eq!(day.part_1(), 42);\\n }\\n\\n #[test]\\n fn part_2() {\\n let path = DayXX::get_sample(2);\\n let mut day = DayXX::new(path);\\n assert_eq!(day.part_2(), 69);\\n }\\n}\\n```\\n\\nWhen you\'re solving the problem, all you need to do is switch between\\n`cargo test` and `cargo run` to check the answer to either sample or the\\nchallenge input itself.\\n\\nIntroduce [bacon] and it gets even better. Bacon is a CLI tool that wraps around\\nthe `cargo` and allows you to check, run, lint or run tests on each file save.\\nIt\'s a very pleasant thing for a so-called _compiler-assisted_ development.\\n\\nSpeaking of linting from within the bacon, you cannot leave out the [clippy].\\nNot only it can whip your ass because of errors, but it can also produce a lot\\nof helpful suggestions, for example passing slices by borrow instead of\\nborrowing the `Vec` itself when you don\'t need it.\\n\\n### Standard library\\n\\nThere\'s **a lot** included in the standard library. It almost feels like you\\nhave all you need[^4]. 
I like placeholders (like `todo!()`, `unreachable!()`,\\n`unimplemented!()`) to the extent of\\n[implementing](/cpp/exceptions-and-raii/placeholders) them as exceptions in C++.\\n\\nYou can find almost anything. Though you can also hit some very weird issues\\nwith some of the nuances of the type system.\\n\\n### `unsafe`\\n\\nThis might be something that people like to avoid as much as possible. However I\\nthink that forming a habit of commenting posibly unsafe operations in **any**\\nlanguage is a good habit, as I\'ve mentioned above. You should be able to argue\\nwhy you can do something safely, even if the compiler is not kicking your ass\\nbecause of it.\\n\\nExcerpt of such comment from work:\\n\\n```py\\n# SAFETY: Taking first package instead of specific package should be\\n# safe, since we have put a requirement on \xbbone\xab \u2039upstream_project_url\u203a\\n# per Packit config, i.e. even if we\'re dealing with a monorepo, there\\n# is only \xbbone\xab upstream. If there is one upstream, there is only one\\n# set of GPG keys that can be allowed.\\nreturn self.downstream_config.packages[\\n self.downstream_config._first_package\\n].allowed_gpg_keys\\n```\\n\\n### Traits\\n\\nOne of the other things I like are the traits. They are more restrictive than\\ntemplates or concepts in C++, but they\'re doing their job pretty good. If you\\nare building library and require multiple traits to be satisfied it means a lot\\nof copy-paste, but that\'s soon to be fixed by the [trait aliases].\\n\\n:::tip Comparing to other languages\\n\\nOn Wikipedia I\'ve seen trait being defined as a more restrictive type class as\\nyou may know it from the Haskell for example. C++ isn\'t behind either with its\\n_constraints and concepts_. I would say that we can order them in the following\\norder based on the complexity they can express:\\n\\n```\\nRust\'s trait < Haskell\'s type class < C++\'s concept\\n```\\n\\n:::\\n\\nYou can also hit some issues, like me when trying to support conversions between\\nunderlying numeric types of a 2D vectors or support for using an operator from\\nboth sides (I couldn\'t get `c * u` to work in the same way as `u * c` because\\nthe first one requires you to implement the trait of a built-in type).\\n\\n:::warning Implementation\\n\\nImplementing traits lies in\\n\\n```rust\\nimpl SomeTrait for SomeStruct {\\n // implementation goes here\\n}\\n```\\n\\nOne of the things I **would love to** see is being able to define the helper\\nfunctions within the same block. As of now, the only things allowed are the ones\\nthat are required by the trait, which in the end results in a randomly lying\\nfunctions around (or in a implementation of the structure itself). I don\'t like\\nthis mess at all\u2026\\n\\n:::\\n\\n### Influence of functional paradigm\\n\\nYou can see a big influence of the functional paradigm. Not only in iterators,\\nbut also in the other parts of the language. For example I prefer `Option` or\\n`Result` to `null`s and exceptions. Pattern matching together with\\ncompiler both enforces handling of the errors and rather user-friendly way of\\ndoing it.\\n\\nNot to mention `.and_then()` and such. However spending most of the time with\\nthe AoC you get pretty annoyed of the repetitive `.unwrap()` during parsing,\\nsince you are guaranteed correct input.\\n\\n### Macros\\n\\nMacros are a very strong pro of the Rust. 
And no, we\'re not going to talk about\\nthe procedural macros\u2026\\n\\nAs I\'ve shown above I\'ve managed to \u201ctame\u201d a lot of copy-paste in the tests for\\nthe AoC by utilizing a macro that generated a very basic template for the tests.\\n\\nAs I have mentioned the traits above, I cannot forget to give props to `derive`\\nmacro that allows you to \u201cdeduce\u201d the default implementation. It is very helpful\\nfor a tedious tasks like implementing `Debug` (for printing out the structures)\\nor comparisons, though with the comparisons you need to be careful about the\\ndefault implementation, it has already bitten me once or twice.\\n\\n## Summary\\n\\nOverall there are many things about the Rust I like and would love to see them\\nimplemented in other languages. However there are also many things I don\'t like.\\nNothing is **exclusively** black and white.\\n\\n[advent of code]: https://adventofcode.com\\n[bacon]: https://dystroy.org/bacon/\\n[clippy]: https://github.com/rust-lang/rust-clippy\\n[codeforces]: https://codeforces.com\\n[trait aliases]: https://github.com/rust-lang/rfcs/blob/master/text/1733-trait-alias.md\\n\\n[^1]:\\n not to even mention multiple different packaging standards Python has, which\\n is borderline https://xkcd.com/927/\\n\\n[^2]: pun intended\\n[^3]: It\'s not that easy with the Rust compiler, but OK\u2026\\n[^4]:\\n unlike Python where there\'s whole universe in the language itself, yet there\\n are essential things not present\u2026"},{"id":"/2023/08/02/copr","metadata":{"permalink":"/blog/2023/08/02/copr","editUrl":"https://github.com/mfocko/blog/tree/main/blog/2023-08-02-copr.md","source":"@site/blog/2023-08-02-copr.md","title":"How can Copr help with broken dependencies","description":"Copr comes to save you when maintainer doesn\'t care.","date":"2023-08-02T00:00:00.000Z","formattedDate":"August 2, 2023","tags":[{"label":"\ud83c\udfed","permalink":"/blog/tags/\ud83c\udfed"},{"label":"red-hat","permalink":"/blog/tags/red-hat"},{"label":"copr","permalink":"/blog/tags/copr"},{"label":"admin","permalink":"/blog/tags/admin"},{"label":"vps","permalink":"/blog/tags/vps"}],"readingTime":3.44,"hasTruncateMarker":true,"authors":[{"name":"Matej Focko","email":"me+blog@mfocko.xyz","title":"a.k.a. your opinionated admin","url":"https://gitlab.com/mfocko","imageURL":"https://github.com/mfocko.png","key":"mf"}],"frontMatter":{"title":"How can Copr help with broken dependencies","description":"Copr comes to save you when maintainer doesn\'t care.","date":"2023-08-02T00:00:00.000Z","authors":[{"key":"mf","title":"a.k.a. your opinionated admin"}],"tags":["\ud83c\udfed","red-hat","copr","admin","vps"]},"unlisted":false,"prevItem":{"title":"Mixed feelings on Rust","permalink":"/blog/2024/01/28/rust-opinion"},"nextItem":{"title":"4th week of Advent of Code \'22 in Rust","permalink":"/blog/aoc-2022/4th-week"}},"content":"When you decide to run Fedora on your VPS, you might get screwed over by using\\nrandom repositories\u2026\\n\\n\x3c!--truncate--\x3e\\n\\nWhen I \u201creserved\u201d my VPS[^1] back in June \'20, I slapped Fedora on it without\\nthinking. I bet 99% of people would say that I\'m crazy for doing such thing[^2],\\n**BUT** I\'ve been using Fedora on my PCs for some time already and it felt very\\nstable and natural to just use, even for VPS.\\n\\nOne of the first things I\'ve done was setting up a mail server. 
You may guess\\nwhat\'s the fun part about having a mail server\u2026 Yes, it\'s all the spam you\\nreceive and only then you realize how much \u201ccrap\u201d gets filtered on free mail\\nservices. To battle this problem I chose to use\\n[rspamd](https://github.com/rspamd/rspamd) that had CentOS support, but someone\\nhad a [Copr](https://copr.fedorainfracloud.org/) repository that I used to\\ninstall it.\\n\\n## How does Copr repositories work?\\n\\nIf you have ever used Ubuntu, you might be familiar with the concept since it is\\nvery close to [PPAs](https://help.ubuntu.com/community/PPA).\\n\\ntl;dr of the whole process consists of\\n\\n1. enabling the Copr repository, and\\n2. installing the desired package.\\n\\nSo in shell you would do\\n\\n```\\n# dnf copr enable \u2039copr-repository\u203a\\n# dnf install \u2039package-from-the-repository\u203a\\n```\\n\\nAnd\u2026 that\'s it! Nothing else needed! Simple, right? And literally same process\\nas you would do for the PPA.\\n\\n:::tip AUR\\n\\nOn the other hand, if you are familiar with the archLinux, you definitely know\\nAUR and what it can do for you. Copr repository is pretty similar, but the\\npackages are prebuilt in Copr and Copr repositories can carry the required\\ndependencies for said packages, which simplifies the distribution, and can even\\nhelp with installing singular packages (when you just need the dependency, not\\neverything).\\n\\n:::\\n\\n## My issue\\n\\nNow you might wonder how would I use it on my VPS. It\'s rather simple, once in\\n6 months a new Fedora release comes out. And you need to upgrade to newer\\nrelease\u2026 You don\'t need to do it right away and for such setup it probably isn\'t\\neven recommended.\\n\\n:::tip\\n\\nFedora releases are supported for a year, i.e. they live 6 months till the next\\nrelease and then another 6 months till another release.\\n\\nSome people prefer to run one version \u201cbehind\u201d. If you ever decide to run it on\\nyour home server or in a similar setup, it might be a pretty good idea to\\nfollow. I\'m using the \u201clatest greatest\u201d, cause why not :smile:\\n\\nOne way or another, you still need to bump the release every six months, unless\\nyou\'d bump 2 releases at once every year, which would be a decision, since, at\\nleast I, cannot see any benefits in it\u2026 You don\'t go for \u201cstability\u201d, cause once\\na year you switch to the latest release and then, before you bump, you use one\\nyear old software, so you\'re not even using the latest.\\n\\n:::\\n\\nFast-forward 2 years in the future, new Fedora release came out (October \'22)\\nand I was doing an upgrade. Dependencies of the rspamd have been updated and\\nrspamd builds in Copr have failed and no one fixed it. Cool, so now I can\\nupgrade, but can either ignore the dependencies or uninstall the rspamd\u2026\\n\\n## How can Copr help?\\n\\nI have managed to find\\n[specfile](https://github.com/rspamd/rspamd/blob/master/rpm/rspamd.spec) for the\\nrspamd package that they use for CentOS. There were some files apart from the\\nspecfile, so I had to make an SRPM locally and then\u2026 I just uploaded the SRPM\\nto the Copr to\\n[build](https://copr.fedorainfracloud.org/coprs/mfocko/rspamd/build/5046567/)\\nan RPM.\\n\\nI have switched the previous Copr repository for rspamd with my own and happily\\nproceeded with the upgrade.\\n\\n## Conclusion\\n\\nCopr is heavily used for testing builds on the upstream with\\n[Packit](https://packit.dev). 
However, as you can see, it is possible to use it\\n**very well** for packaging your own stuff and avoiding issues (such as the one\\nI have described above), if need be.\\n\\n[^1]: [vpsFree.cz](https://vpsfree.cz)\\n[^2]:\\n Even though I\'ve been running archLinux on some Raspberry Pi\'s and also\\n on one of my \u201chome servers\u201d, before getting the VPS. You could say I like\\n to live on the edge\u2026"},{"id":"aoc-2022/4th-week","metadata":{"permalink":"/blog/aoc-2022/4th-week","editUrl":"https://github.com/mfocko/blog/tree/main/blog/aoc-2022/04-week-4.md","source":"@site/blog/aoc-2022/04-week-4.md","title":"4th week of Advent of Code \'22 in Rust","description":"Surviving fourth week in Rust.","date":"2023-07-07T15:14:00.000Z","formattedDate":"July 7, 2023","tags":[{"label":"advent-of-code","permalink":"/blog/tags/advent-of-code"},{"label":"advent-of-code-2022","permalink":"/blog/tags/advent-of-code-2022"},{"label":"rust","permalink":"/blog/tags/rust"}],"readingTime":15.175,"hasTruncateMarker":true,"authors":[{"name":"Matej Focko","email":"me+blog@mfocko.xyz","title":"a.k.a. @mf","url":"https://gitlab.com/mfocko","imageURL":"https://github.com/mfocko.png","key":"mf"}],"frontMatter":{"title":"4th week of Advent of Code \'22 in Rust","description":"Surviving fourth week in Rust.","date":"2023-07-07T15:14","slug":"aoc-2022/4th-week","authors":"mf","tags":["advent-of-code","advent-of-code-2022","rust"],"hide_table_of_contents":false},"unlisted":false,"prevItem":{"title":"How can Copr help with broken dependencies","permalink":"/blog/2023/08/02/copr"},"nextItem":{"title":"3rd week of Advent of Code \'22 in Rust","permalink":"/blog/aoc-2022/3rd-week"}},"content":"Let\'s go through the fourth week of [_Advent of Code_] in Rust.\\n\\n\x3c!--truncate--\x3e\\n\\n## [Day 22: Monkey Map](https://adventofcode.com/2022/day/22)\\n\\n:::info tl;dr\\n\\nSimulating a movement on a 2D map with given instructions. Map becomes a cube in\\nthe 2nd part\u2026\\n\\n:::\\n\\n:::caution Rant\\n\\nThis was the most obnoxious problem of this year\u2026 and a lot of Rust issues have\\nbeen hit.\\n\\n:::\\n\\n### Solution\\n\\nIt seems like a very simple problem to solve, but with very obnoxious changes in\\nthe 2nd part and also it\'s relatively hard to decompose \xbbproperly\xab.\\n\\n#### Column iterator\\n\\nIn the first part of the problem it was needed to know the boundaries of each\\nrow and column, since I stored them in `Vec>` and padded with spaces\\nto ensure I have a rectangular 2D \u201carray\u201d. 
However when you wanted to go through\\neach row and column to determine the boundaries, it was very easy to do for the\\nrows (cause each row is a `Vec` element), but not for the columns, since they\\nspan multiple rows.\\n\\nFor this use case I have implemented my own _column iterator_:\\n\\n```rust\\npub struct ColumnIterator<\'a, T> {\\n map: &\'a [Vec],\\n column: usize,\\n\\n i: usize,\\n}\\n\\nimpl<\'a, T> ColumnIterator<\'a, T> {\\n pub fn new(map: &\'a [Vec], column: usize) -> ColumnIterator<\'a, T> {\\n Self { map, column, i: 0 }\\n }\\n}\\n\\nimpl<\'a, T> Iterator for ColumnIterator<\'a, T> {\\n type Item = &\'a T;\\n\\n fn next(&mut self) -> Option {\\n if self.i >= self.map.len() {\\n return None;\\n }\\n\\n self.i += 1;\\n Some(&self.map[self.i - 1][self.column])\\n }\\n}\\n```\\n\\nGiven this piece of an iterator, it is very easy to factor out the common\\nfunctionality between the rows and columns into:\\n\\n```rust\\nlet mut find_boundaries = |constructor: fn(usize) -> Orientation,\\n iterator: &mut dyn Iterator,\\n upper_bound,\\n i| {\\n let mut first_non_empty = iterator.enumerate().skip_while(|&(_, &c)| c == \' \');\\n let start = first_non_empty.next().unwrap().0 as isize;\\n\\n let mut last_non_empty = first_non_empty.skip_while(|&(_, &c)| c != \' \');\\n let end = last_non_empty.next().unwrap_or((upper_bound, &\'_\')).0 as isize;\\n\\n boundaries.insert(constructor(i), start..end);\\n};\\n```\\n\\nAnd then use it as such:\\n\\n```rust\\n// construct all horizontal boundaries\\n(0..map.len()).for_each(|row| {\\n find_boundaries(\\n Orientation::horizontal,\\n &mut map[row].iter(),\\n map[row].len(),\\n row,\\n );\\n});\\n\\n// construct all vertical boundaries\\n(0..map[0].len()).for_each(|col| {\\n find_boundaries(\\n Orientation::vertical,\\n &mut ColumnIterator::new(&map, col),\\n map.len(),\\n col,\\n );\\n});\\n```\\n\\n#### Walking around the map\\n\\nOnce the 2nd part got introduced, you start to think about a way how not to\\ncopy-paste a lot of stuff (I haven\'t avoided it anyways\u2026). In this problem, I\'ve\\nchosen to introduce a trait (i.e. _interface_) for 2D and 3D walker.\\n\\n```rust\\ntrait Wrap: Clone {\\n type State;\\n\\n // simulation\\n fn is_blocked(&self) -> bool;\\n fn step(&mut self, steps: isize);\\n fn turn_left(&mut self);\\n fn turn_right(&mut self);\\n\\n // movement\\n fn next(&self) -> (Self::State, Direction);\\n\\n // final answer\\n fn answer(&self) -> Output;\\n}\\n```\\n\\nEach walker maintains its own state and also provides the functions that are\\nused during the simulation. 
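\\n\\nTo make the shared driving logic concrete, here is a minimal, self-contained sketch of a single simulation loop folding over instructions for anything that implements such a trait; the names (`Walker`, `Instruction`, `simulate`, `LineWalker`) are simplified placeholders and **not** the actual solution types:\\n\\n```rust\\n// Simplified, hypothetical trait -- the real `Wrap` trait above has more\\n// methods and an associated `State` type.\\ntrait Walker {\\n    fn step(&mut self, steps: isize);\\n    fn turn_left(&mut self);\\n    fn turn_right(&mut self);\\n    fn answer(&self) -> isize;\\n}\\n\\n#[derive(Clone, Copy)]\\nenum Instruction {\\n    Step(isize),\\n    Left,\\n    Right,\\n}\\n\\n// The driving loop is written only once and folds over the instructions,\\n// no matter which walker implementation it receives.\\nfn simulate<W: Walker>(walker: W, instructions: &[Instruction]) -> isize {\\n    instructions\\n        .iter()\\n        .fold(walker, |mut w, &i| {\\n            match i {\\n                Instruction::Step(n) => w.step(n),\\n                Instruction::Left => w.turn_left(),\\n                Instruction::Right => w.turn_right(),\\n            }\\n            w\\n        })\\n        .answer()\\n}\\n\\n// Trivial walker on a line, only here so the sketch compiles and runs.\\nstruct LineWalker {\\n    position: isize,\\n    direction: isize,\\n}\\n\\nimpl Walker for LineWalker {\\n    fn step(&mut self, steps: isize) {\\n        self.position += self.direction * steps;\\n    }\\n    fn turn_left(&mut self) {\\n        self.direction = -self.direction;\\n    }\\n    fn turn_right(&mut self) {\\n        self.direction = -self.direction;\\n    }\\n    fn answer(&self) -> isize {\\n        self.position\\n    }\\n}\\n\\nfn main() {\\n    let moves = [Instruction::Step(3), Instruction::Left, Instruction::Step(1)];\\n    println!(\\"{}\\", simulate(LineWalker { position: 0, direction: 1 }, &moves)); // 2\\n}\\n```\\n\\n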
The \u201cpromised\u201d methods are separated into:\\n\\n- _simulation_-related: that are used during the simulation from the `.fold()`\\n- _movement_-related: just a one method that holds most of the logic differences\\n between 2D and 3D\\n- _final answer_: which extracts the _proof of solution_ from the\\n implementation-specific walker\\n\\nBoth 2D and 3D versions borrow the original input and therefore you must\\nannotate the lifetime of it:\\n\\n```rust\\nstruct Wrap2D<\'a> {\\n input: &\'a Input,\\n position: Position,\\n direction: Direction,\\n}\\nimpl<\'a> Wrap2D<\'a> {\\n fn new(input: &\'a Input) -> Wrap2D<\'a> {\\n// \u2026\\n```\\n\\n#### Problems\\n\\nI have used a lot of closures for this problem and once I introduced a parameter\\nthat was of unknown type (apart from the fact it implements a specific trait), I\\ngot suggested a \u201cfix\u201d for the compilation error that resulted in something that\\nwas not possible to parse, cause it, more than likely, violated the grammar.\\n\\nIn a similar fashion, I have been suggested changes that led to a code that\\ndidn\'t make sense by just looking at it (there was no need to try the changes),\\nfor example one suggested change in the closure parameter caused disapperance of\\nthe parameter name. :smile:\\n\\n#### Clippy\\n\\nI have to admit that Clippy was rather helpful here, I\'ll include two examples\\nof rather smart suggestions.\\n\\nWhen writing the parsing for this problem, the first thing I have spotted on the\\n`char` was the `.is_digit()` function that takes a radix as a parameter. Clippy\\nnoticed that I use `radix = 10` and suggested switching to `.is_ascii_digit()`\\nthat does exactly the same thing:\\n\\n```diff\\n- .take_while(|c| c.is_digit(10))\\n+ .take_while(|c| c.is_ascii_digit())\\n```\\n\\nAnother useful suggestion appeared when working with the iterators and I wanted\\nto get the $n$-th element from it. You know the `.skip()`, you know the\\n`.next()`, just \u201cslap\u201d them together and we\'re done for :grin: Well, I got\\nsuggested to use `.nth()` that does exactly the combination of the two mentioned\\nmethods on iterators:\\n\\n```diff\\n- match it.clone().skip(skip).next().unwrap() {\\n+ match it.clone().nth(skip).unwrap() {\\n```\\n\\n## [Day 23: Unstable Diffusion](https://adventofcode.com/2022/day/23)\\n\\n:::info tl;dr\\n\\nSimulating movement of elves around with a set of specific rules.\\n\\n:::\\n\\n### Solution\\n\\nThere\'s not much to mention since it\'s just a cellular automaton simulation\\n(even though the AoC rules for cellular automatons usually get out of hand\\n:wink:).\\n\\nAlthough I had a need to determine boundaries of the elves\' positions and ended\\nup with a nasty DRY violation. Knowing that you you\'re looking for maximum and\\nminimum that are, of course, exactly the same except for initial values and\\ncomparators, it looks like a rather simple fix, but typing in Rust is something\\nelse, right? 
In the end I settled for a function that computes both boundaries\\nwithout any duplication while using a closure:\\n\\n```rust\\nfn get_bounds(positions: &Input) -> (Vector2D, Vector2D) {\\n let f = |init, cmp: &dyn Fn(isize, isize) -> isize| {\\n positions\\n .iter()\\n .fold(Vector2D::new(init, init), |acc, elf| {\\n Vector2D::new(cmp(acc.x(), elf.x()), cmp(acc.y(), elf.y()))\\n })\\n };\\n\\n (f(isize::MAX, &min::), f(isize::MIN, &max::))\\n}\\n```\\n\\nThis function returns a pair of 2D vectors that represent opposite points of the\\nbounding rectangle of all elves.\\n\\nYou might ask why would we need a closure and the answer is that `positions`\\ncannot be captured from within the nested function, only via closure. One more\\nfun fact on top of that is the type of the comparator\\n\\n```rust\\n&dyn Fn(isize, isize) -> isize\\n```\\n\\nOnce we remove the `dyn` keyword, compiler yells at us and also includes a way\\nhow to get a more thorough explanation of the error by running\\n\\n $ rustc --explain E0782\\n\\nwhich shows us\\n\\n Trait objects must include the `dyn` keyword.\\n\\n Erroneous code example:\\n\\n ```\\n trait Foo {}\\n fn test(arg: Box) {} // error!\\n ```\\n\\n Trait objects are a way to call methods on types that are not known until\\n runtime but conform to some trait.\\n\\n Trait objects should be formed with `Box`, but in the code above\\n `dyn` is left off.\\n\\n This makes it harder to see that `arg` is a trait object and not a\\n simply a heap allocated type called `Foo`.\\n\\n To fix this issue, add `dyn` before the trait name.\\n\\n ```\\n trait Foo {}\\n fn test(arg: Box) {} // ok!\\n ```\\n\\n This used to be allowed before edition 2021, but is now an error.\\n\\n:::danger Rant\\n\\nNot all of the explanations are helpful though, in some cases they might be even\\nmore confusing than helpful, since they address _very simple_ use cases.\\n\\nAs you can see, even in this case there are two sides to the explanations:\\n\\n- it explains why you need to use `dyn`, but\\n- it still mentions that trait objects need to be heap-allocated via `Box`\\n that, as you can see in my snippet, **does not** apply here :smile: IMO it\'s\\n caused by the fact that we are borrowing it and therefore we don\'t need to\\n care about the size or whereabouts of it.\\n\\n:::\\n\\n:::info C++ parallel\\n\\nIf you dive into the explanation above, you can notice that the `Box`\\npattern is very helpful for using types that are not known during compile-time.\\nYou would use a very similar approach in C++ when parsing some data structure\\nfrom input (let\'s say JSON for example).\\n\\nOn the other hand, in this case, it doesn\'t really make much sense, cause you\\ncan clearly see that the types **are known** during the compile-time, which in\\nC++ could be easily resolved by templating the helper function.\\n\\n:::\\n\\n## [Day 24: Blizzard Basin](https://adventofcode.com/2022/day/24)\\n\\n:::info tl;dr\\n\\nNavigating your way through a basin with series of blizzards that move around\\nyou as you move.\\n\\n:::\\n\\n:::caution\\n\\nIt\'s second to last day and I went \u201c_bonkers_\u201d on the Rust :smile: Proceed to\\nread _Solution_ part on your own risk.\\n\\n:::\\n\\n### Solution\\n\\nYou are given a map with blizzards all over the place and you\'re supposed to\\nfind the minimum time it requires you to walk through the basin without getting\\nin any of the blizzards.\\n\\n#### Breakdown\\n\\nRelatively simple, yet a bit annoying, approach can be taken. 
It\'s technically\\na shortest-path algorithm implementation with some relaxation restrictions and\\nbeing able to stay on one position for some time, so each _vertex_ of the graph\\nis determined by the position on the map and the _timestamp_. I have chosen to\\nuse `Vector3D`, since `x` and `y` attributes can be used for the position\\nand, well, let\'s use `z` for a timestamp, cause why not, right? :wink:\\n\\n#### Evaluating the blizzards\\n\\n:::caution\\n\\nI think that this is the most perverted abuse of the traits in the whole 4 weeks\\nof AoC in Rust\u2026\\n\\n:::\\n\\nThe blizzards move along their respective directions in time and loop around in\\ntheir respective row/column. Each vertex holds position **and** time, so we can\\n_just_ index the basin with the vertex itself, right? Yes, we can :smiling_imp:\\n\\n:::tip Fun fact\\n\\nWhile writing this part, I\'ve recognized unnecessary verbosity in the code and\\ncleaned it up a bit. The changed version is shown here and the original was just\\nmore verbose.\\n\\n:::\\n\\nI\'ll skip the boring parts of checking bounds and entry/exit of the basin :wink:\\nWe can easily calculate positions of the blizzards using a modular arithmetics:\\n\\n```rust\\nimpl Index for Basin {\\n type Output = char;\\n\\n fn index(&self, index: Position) -> &Self::Output {\\n // \u2039skipped boring parts\u203a\\n\\n // We need to account for the loops of the blizzards\\n let width = self.cols - 2;\\n let height = self.rows - 2;\\n\\n let blizzard_origin = |size, d, t, i| ((i - 1 + size + d * (t % size)) % size + 1) as usize;\\n [\\n (\\n index.y() as usize,\\n blizzard_origin(width, -1, index.z(), index.x()),\\n \'>\',\\n ),\\n (\\n index.y() as usize,\\n blizzard_origin(width, 1, index.z(), index.x()),\\n \'<\',\\n ),\\n (\\n blizzard_origin(height, -1, index.z(), index.y()),\\n index.x() as usize,\\n \'v\',\\n ),\\n (\\n blizzard_origin(height, 1, index.z(), index.y()),\\n index.x() as usize,\\n \'^\',\\n ),\\n ]\\n .iter()\\n .find_map(|&(y, x, direction)| {\\n if self.map[y][x] == direction {\\n Some(&self.map[y][x])\\n } else {\\n None\\n }\\n })\\n .unwrap_or(&\'.\')\\n }\\n}\\n```\\n\\nAs you can see, there is an expression for calculating the original position and\\nit\'s used multiple times, so why not take it out to a lambda, right? :wink:\\n\\nI couldn\'t get the `rustfmt` to format the `for`-loop nicely, so I\'ve just\\ndecided to go with iterating over an elements of a slice. 
I have used, once\\nagain, a combination of two functions (`find_map` in this case) to do 2 things\\nat once and at the end, if we haven\'t found any blizzard, we just return the\\nempty space.\\n\\nI think it\'s a very _nice_ (and naughty) way how to use the `Index` trait, don\'t\\nyou think?\\n\\n#### Shortest-path algorithm\\n\\nFor the shortest path you can choose and adjust any of the common shortest-path\\nalgorithms, in my case, I have decided to use [_A\\\\*_] instead of Dijkstra\'s\\nalgorithm, since it better reflects the _cost_ function.\\n\\n:::info Comparison of costs\\n\\nWith the Dijkstra\'s algorithm I would proceed with the `time` attribute used as\\na priority for the queue.\\n\\nWhereas with the _A\\\\*_, I have chosen to use both time and Manhattan distance\\nthat promotes vertices closer to the exit **and** with a minimum time taken.\\n\\n:::\\n\\nCost function is, of course, a closure :wink:\\n\\n```rust\\nlet cost = |p: Position| p.z() as usize + exit.y().abs_diff(p.y()) + exit.x().abs_diff(p.x());\\n```\\n\\nAnd also for checking the possible moves from the current vertex, I have\\nimplemented, yet another, closure that yields an iterator with the next moves:\\n\\n```rust\\nlet next_positions = |p| {\\n [(0, 0, 1), (0, -1, 1), (0, 1, 1), (-1, 0, 1), (1, 0, 1)]\\n .iter()\\n .filter_map(move |&(x, y, t)| {\\n let next_p = p + Vector3D::new(x, y, t);\\n\\n if basin[next_p] == \'.\' {\\n Some(next_p)\\n } else {\\n None\\n }\\n })\\n};\\n```\\n\\n#### Min-heap\\n\\nIn this case I had a need to use the priority queue taking the elements with the\\nlowest cost as the prioritized ones. Rust only offers you the [`BinaryHeap`] and\\nthat is a max-heap. One of the ways how to achieve a min-heap is to put the\\nelements in wrapped in a [`Reverse`] (as is even showed in the linked [docs of\\nthe `BinaryHeap`]). However the wrapping affects the type of the heap and also\\npopping the most prioritized elements yields values wrapped in the `Reverse`.\\n\\nFor this purpose I have just taken the max-heap and wrapped it as a whole in a\\nseparate structure providing just the desired methods:\\n\\n```rust\\nuse std::cmp::{Ord, Reverse};\\nuse std::collections::BinaryHeap;\\n\\npub struct MinHeap {\\n heap: BinaryHeap>,\\n}\\n\\nimpl MinHeap {\\n pub fn new() -> MinHeap {\\n MinHeap {\\n heap: BinaryHeap::new(),\\n }\\n }\\n\\n pub fn push(&mut self, item: T) {\\n self.heap.push(Reverse(item))\\n }\\n\\n pub fn pop(&mut self) -> Option {\\n self.heap.pop().map(|Reverse(x)| x)\\n }\\n}\\n\\nimpl Default for MinHeap {\\n fn default() -> Self {\\n Self::new()\\n }\\n}\\n```\\n\\nRest is just the algorithm implementation which is not that interesting.\\n\\n## [Day 25: Full of Hot Air](https://adventofcode.com/2022/day/25)\\n\\n:::info tl;dr\\n\\nPlaying around with a numbers in a _special_ base.\\n\\n:::\\n\\nGetting flashbacks to the _IB111 Foundations of Programming_\u2026 Very nice \u201cproblem\u201d\\nwith a rather easy solution, as the last day always seems to be.\\n\\n### Solution\\n\\nImplementing 2 functions, converting from the _SNAFU base_ and back to the _SNAFU_\\n_base_ representation. Let\'s do a bit more though! I have implemented two functions:\\n\\n- `from_snafu`\\n- `to_snafu`\\n\\nNow it is apparent that all I do is number to string and string to number. Hmm\u2026\\nthat sounds familiar, doesn\'t it? 
Let\'s introduce a structure for the SNAFU numbers\\nand implement the traits that we need.\\n\\nLet\'s start with a structure:\\n\\n```rust\\n#[derive(Debug, PartialEq, Eq, PartialOrd, Ord)]\\nstruct SNAFU {\\n value: i64,\\n}\\n```\\n\\n#### Converting from `&str`\\n\\nWe will start by implementing the `FromStr` trait that will help us parse our input.\\nThis is rather simple, I can just take the `from_snafu` function, copy-paste it\\ninto the `from_str` method and the number I get will be wrapped in `Result` and\\n`SNAFU` structure.\\n\\n#### Converting to `String`\\n\\nThis is more fun. In some cases you need to implement only one trait and others\\nare automatically implemented using that one trait. In our case, if you look in\\nthe documentation, you can see that `ToString` trait is automatically implemented\\nfor any type that implements `Display` trait.\\n\\nLet\'s implement the `Display` trait then. We should be able to use the `to_snafu`\\nfunction and just take the `self.value` from the `SNAFU` structure.\\n\\nAnd for the convenience of tests, we can also implement a rather simple `From`\\ntrait for the `SNAFU`.\\n\\n#### Adjusting the code\\n\\nAfter those changes we need to adjust the code and tests.\\n\\nParsing of the input is very easy, before we have used the lines, now we parse\\neverything:\\n\\n```diff\\n fn parse_input>(pathname: P) -> Input {\\n- file_to_lines(pathname)\\n+ file_to_structs(pathname)\\n }\\n```\\n\\nPart 1 needs to be adjusted a bit too:\\n\\n```diff\\n fn part_1(input: &Input) -> Output {\\n- to_snafu(input.iter().map(|s| from_snafu(s)).sum())\\n+ SNAFU::from(input.iter().map(|s| s.value).sum::()).to_string()\\n }\\n```\\n\\nYou can also see that it simplifies the meaning a bit and it is more explicit than\\nthe previous versions.\\n\\nAnd for the tests:\\n\\n```diff\\n #[test]\\n fn test_from() {\\n- for (n, s) in EXAMPLES.iter() {\\n- assert_eq!(from_snafu(s), *n);\\n+ for (&n, s) in EXAMPLES.iter() {\\n+ assert_eq!(s.parse::().unwrap().value, n);\\n }\\n }\\n\\n #[test]\\n fn test_to() {\\n- for (n, s) in EXAMPLES.iter() {\\n- assert_eq!(to_snafu(*n), s.to_string());\\n+ for (&n, s) in EXAMPLES.iter() {\\n+ assert_eq!(SNAFU::from(n).to_string(), s.to_string());\\n }\\n```\\n\\n## Summary\\n\\nLet\'s wrap the whole thing up! Keeping in mind both AoC and the Rust\u2026\\n\\n![Finished advent calendar :smile:](/img/blog/aoc-2022/04-week-4/calendar.png)\\n\\n### Advent of Code\\n\\nThis year was quite fun, even though most of the solutions and posts came in\\nlater on (_cough_ in \'23 _cough_). Day 22 was the most obnoxious one\u2026 And also\\nit feels like I used priority queues and tree data structures **a lot** :eyes:\\n\\n### with Rust\\n\\nI must admit that a lot of compiler warnings and errors were very useful. Even\\nthough I still found some instances where they didn\'t help at all or cause even\\nworse issues than I had. Compilation times have been addressed with the caching.\\n\\nBuilding my first tree data structure in Rust has been a very \u201cinteresting\u201d\\njourney. 
Being able to write a more generic BFS algorithm that allows you to not\\nduplicate code while still mantaining the desired functionality contributes to\\na very readable code.\\n\\nI am definitely much more aware of the basic things that bloated Python is\\nmissing, yet Rust has them\u2026\\n\\nUsing explicit types and writing down placeholder functions with `todo!()`\\nmacros is very pleasant, since it allows you to easily navigate the type system\\nduring the development when you don\'t even need to be sure how are you going to\\nput the smaller pieces together.\\n\\nI have used a plethora of traits and also implemented some of them to either be\\nidiomatic, or exploit the syntactic sugar they offer. Deriving the default trait\\nimplementation is also very helpful in a lot of cases, e.g. debugging output,\\ncopying, equality comparison, etc.\\n\\nI confess to touching more \u201ccursed\u201d parts of the Rust, such as macros to\\ndeclutter the copy-paste for tests or writing my own structures that need to\\ncarry a lifetime for their own fields.\\n\\ntl;dr Relatively pleasant language until you hit brick wall :wink:\\n\\n---\\n\\nSee you next year! Maybe in Rust, maybe not :upside_down_face:\\n\\n[_advent of code_]: https://adventofcode.com\\n[_a\\\\*_]: https://en.wikipedia.org/wiki/A*_search_algorithm\\n[`binaryheap`]: https://doc.rust-lang.org/std/collections/struct.BinaryHeap.html\\n[`reverse`]: https://doc.rust-lang.org/std/cmp/struct.Reverse.html\\n[docs of the `binaryheap`]: https://doc.rust-lang.org/std/collections/struct.BinaryHeap.html#min-heap"},{"id":"aoc-2022/3rd-week","metadata":{"permalink":"/blog/aoc-2022/3rd-week","editUrl":"https://github.com/mfocko/blog/tree/main/blog/aoc-2022/03-week-3.md","source":"@site/blog/aoc-2022/03-week-3.md","title":"3rd week of Advent of Code \'22 in Rust","description":"Surviving third week in Rust.","date":"2023-07-06T21:00:00.000Z","formattedDate":"July 6, 2023","tags":[{"label":"advent-of-code","permalink":"/blog/tags/advent-of-code"},{"label":"advent-of-code-2022","permalink":"/blog/tags/advent-of-code-2022"},{"label":"rust","permalink":"/blog/tags/rust"}],"readingTime":11.57,"hasTruncateMarker":true,"authors":[{"name":"Matej Focko","email":"me+blog@mfocko.xyz","title":"a.k.a. 
@mf","url":"https://gitlab.com/mfocko","imageURL":"https://github.com/mfocko.png","key":"mf"}],"frontMatter":{"title":"3rd week of Advent of Code \'22 in Rust","description":"Surviving third week in Rust.","date":"2023-07-06T21:00","slug":"aoc-2022/3rd-week","authors":"mf","tags":["advent-of-code","advent-of-code-2022","rust"],"hide_table_of_contents":false},"unlisted":false,"prevItem":{"title":"4th week of Advent of Code \'22 in Rust","permalink":"/blog/aoc-2022/4th-week"},"nextItem":{"title":"Sort the matrix diagonally","permalink":"/blog/leetcode/sort-diagonally"}},"content":"Let\'s go through the third week of [_Advent of Code_] in Rust.\\n\\n\x3c!--truncate--\x3e\\n\\n## [Day 15: Beacon Exclusion Zone](https://adventofcode.com/2022/day/15)\\n\\n:::info tl;dr\\n\\nTriangulating a distress beacon based on the information from the sensors.\\n\\n:::\\n\\n### Solution\\n\\nRelatively easy thing to implement, no major Rust issues hit.\\n\\n## [Day 16: Proboscidea Volcanium](https://adventofcode.com/2022/day/16)\\n\\n:::info tl;dr\\n\\nFinding a max flow in a graph given some time constraints.\\n\\n:::\\n\\n### Solution\\n\\nI have used some interesting things to implement this and make it easier for me.\\n\\n#### Indexing in graph\\n\\nI have come across a situation where I needed to keep more information regarding\\nthe graph\u2026 In that case you can, of course, create a structure and keep it in,\\nbut once you have multiple members in the structure it gets harder to work with\\nsince you need to address the fields in the structure. When you work with graph,\\nyou frequently need to access the vertices and in this case it felt a lot easier\\nto implement the indexing in a graph, rather than explicitly access the\\nunderlying data structure.\\n\\nHere you can see a rather short snippet from the solution that allows you to\\n\u201cindex\u201d the graph:\\n\\n```rust\\nimpl Index<&str> for Graph {\\n type Output = Vertex;\\n\\n fn index(&self, index: &str) -> &Self::Output {\\n &self.g[index]\\n }\\n}\\n```\\n\\n#### Cartesian product\\n\\nDuring the implementation I had to utilize Floyd-Warshall algorithm for finding\\nthe shortest path between pairs of vertices and utilized the `iproduct!` macro\\nfrom the [`itertools`]. It is a very useful higher-order function that allows\\nyou to keep the nesting of the loops at a minimum level while still maintaining\\nthe same functionality.\\n\\n#### \u201cImplementing\u201d an iterator\\n\\nFor the second part, you get to split the work between 2 actors. 
That way you\\ncan achieve higher efficiency of the whole process that you\'re planning, but it\\nalso makes it harder to evaluate algorithmically, since you need to check the\\ndifferent ways the work can be split.\\n\\nBeing affected by _functional programming brain damage_:tm:, I have chosen to\\ndo this part by function that returns an iterator over the possible ways:\\n\\n```rust\\nfn pairings(\\n valves: &BTreeSet,\\n) -> impl Iterator, BTreeSet)> + \'_ {\\n let mapping = valves.iter().collect_vec();\\n\\n let max_mask = 1 << (valves.len() - 1);\\n\\n (0..max_mask).map(move |mask| {\\n let mut elephant = BTreeSet::new();\\n let mut human = BTreeSet::new();\\n\\n for (i, &v) in mapping.iter().enumerate() {\\n if (mask & (1 << i)) == 0 {\\n human.insert(v.clone());\\n } else {\\n elephant.insert(v.clone());\\n }\\n }\\n\\n (human, elephant)\\n })\\n}\\n```\\n\\n## [Day 17: Pyroclastic Flow](https://adventofcode.com/2022/day/17)\\n\\n:::info tl;dr\\n\\nSimulating an autonomous Tetris where pieces get affected by a series of jets of\\nhot gas.\\n\\n:::\\n\\n### Solution\\n\\nSimilarly to the previous day I have created some iterators :smile:\\n\\n#### Collision detection\\n\\nOnce you need to check for collisions it is very helpful to be able to just\\niterate through the positions that can actually collide with the wall or other\\npiece.\\n\\nTo get the desired behaviour, you can just compose few smaller functions:\\n\\n```rust\\nfn occupied(shape: &[Vec]) -> impl Iterator + \'_ {\\n shape.iter().enumerate().flat_map(|(y, row)| {\\n row.iter().enumerate().filter_map(move |(x, c)| {\\n if c == &\'#\' {\\n Some(Vector2D::new(x as isize, y as isize))\\n } else {\\n None\\n }\\n })\\n })\\n}\\n```\\n\\nIn the end, we get relative positions which we can adjust later when given the\\nspecific positions from iterator. You can see some interesting parts in this:\\n\\n- `.enumerate()` allows us to get both the indices (coordinates) and the line\\n or, later on, the character itself,\\n- `.flat_map()` flattens the iterator, i.e. when we return another iterator,\\n they just get chained instead of iterating over iterators (which sounds pretty\\n disturbing, doesn\'t it?),\\n- and finally `.filter_map()` which is pretty similar to the \u201cbasic\u201d `.map()`\\n with a one, key, difference that it expects the items of an iterator to be\\n mapped to an `Option` from which it ignores nothing (as in `None` :wink:)\\n and also unwraps the values from `Some(\u2026)`.\\n\\n#### Infinite iterator\\n\\nIn the solution we cycle through both Tetris-like shapes that fall down and the\\njets that move our pieces around. Initially I have implemented my own infinite\\niterator that just yields the indices. It is a very simple, yet powerful, piece\\nof code:\\n\\n```rust\\nstruct InfiniteIndex {\\n size: usize,\\n i: usize,\\n}\\n\\nimpl InfiniteIndex {\\n fn new(size: usize) -> InfiniteIndex {\\n InfiniteIndex { size, i: size - 1 }\\n }\\n}\\n\\nimpl Iterator for InfiniteIndex {\\n type Item = usize;\\n\\n fn next(&mut self) -> Option {\\n self.i = (self.i + 1) % self.size;\\n Some(self.i)\\n }\\n}\\n```\\n\\nHowever when I\'m looking at the code now, it doesn\'t really make much sense\u2026\\nGuess what, we can use a built-in function that is implemented on iterators for\\nthat! 
The function is called `.cycle()`\\n\\nOn the other hand, I am not going to switch to that function, since it would\\nintroduce an another myriad of issues caused by the fact that I create iterators\\nright away in the constructor of my structure and the iterators would borrow\\nboth the jets and shapes which would introduce a lifetime dependency into the\\nstructure.\\n\\n## [Day 18: Boiling Boulders](https://adventofcode.com/2022/day/18)\\n\\n:::info tl;dr\\n\\nLet\'s compute a surface area of some obsidian approximated via coordinates of\\ncubes.\\n\\n:::\\n\\n### Solution\\n\\nThis day is kinda interesting, because it shows how easily you can complicate the\\nproblem and also how much can you screw yourself over with the optimization and\\n\u201csmart\u201d approach.\\n\\nFor the first part you need to find the surface area of an obsidian that is\\napproximated by cubes. Now, that is a very easy thing to do, just keep the track\\nof already added cubes, and check if the newly added cube touches any face of any\\nother cube. Simple, and with a `BTreeSet` relatively efficient way to do it.\\n\\nHowever the second part lets you on a secret that there may be some surface area\\nfrom the \u201cinside\u201d too and you want to know only the one from the outside of the\\nobsidian. I have seen some solutions later, but if you check your data, you might\\nnotice that the bounding box of all the cubes isn\'t that big at all. Therefore I\\nchose to pre-construct the box beforehand, fill in the cubes and then just run a\\nBFS turning all the lava on the outside into the air. Now you just need to check\\ncubes and count how many of their faces touch the air.\\n\\n## [Day 19: Not Enough Minerals](https://adventofcode.com/2022/day/19)\\n\\n:::info tl;dr\\n\\nFinding out the best strategy for building robots to collect geodes.\\n\\n:::\\n\\n### Solution\\n\\nNot much interesting stuff to mention apart from the suggestion to never believe\\nthat the default implementation given by `derive` macro is what you want, it\\ndoesn\'t have to be. :smile:\\n\\n## [Day 20: Grove Positioning System](https://adventofcode.com/2022/day/20)\\n\\n:::info tl;dr\\n\\nShuffling around the _circular linked list_ to find the coordinates.\\n\\n:::\\n\\nNow, small rant for this day is in place. They\'ve never mentioned that coordinates\\ncan repeat and therefore the values are non-unique. This is something that did\\nnot happen in the given sample, but was present in the user input. It took \xbba lot\xab\\nto realize that this is the issue.\\n\\n### Solution\\n\\nI have tried implementing a circular linked list for this\u2026 and I have failed\\nmiserably. To be fair, I still have no clue why. It was \u201cfun\u201d to play around with\\nthe `Rc>`. In the end I failed on _wrong answer_. 
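\\n\\nJust to give an idea of the amount of wrapping involved, a node of such a doubly-linked circular list could look roughly like this; it is a simplified sketch, **not** the failed solution itself, and a real attempt would most likely use `Weak` for the back links to avoid reference cycles:\\n\\n```rust\\nuse std::cell::RefCell;\\nuse std::rc::Rc;\\n\\n// Hypothetical node type; every link needs shared ownership (`Rc`) and\\n// interior mutability (`RefCell`) so that nodes can be re-linked later.\\n// NOTE: strong `Rc` links in both directions form reference cycles and\\n// therefore leak -- `Weak` would be the way to go for `prev`.\\ntype Link = Option<Rc<RefCell<Node>>>;\\n\\nstruct Node {\\n    value: i64,\\n    prev: Link,\\n    next: Link,\\n}\\n\\nfn main() {\\n    // Two nodes linked into a tiny \\"circle\\".\\n    let a = Rc::new(RefCell::new(Node { value: 1, prev: None, next: None }));\\n    let b = Rc::new(RefCell::new(Node { value: 2, prev: None, next: None }));\\n\\n    a.borrow_mut().next = Some(Rc::clone(&b));\\n    a.borrow_mut().prev = Some(Rc::clone(&b));\\n    b.borrow_mut().next = Some(Rc::clone(&a));\\n    b.borrow_mut().prev = Some(Rc::clone(&a));\\n\\n    let next_of_a = a.borrow().next.as_ref().unwrap().borrow().value;\\n    let prev_of_a = a.borrow().prev.as_ref().unwrap().borrow().value;\\n    println!(\\"{next_of_a} {prev_of_a}\\"); // 2 2\\n}\\n```\\n\\n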
I have also encountered\\na rather interesting issue with `.borrow_mut()` method being used on `Rc>`.\\n\\n#### `.borrow_mut()`\\n\\nConsider the following snippet of the code (taken from the documentation):\\n\\n```rust\\nuse std::cell::{RefCell, RefMut};\\nuse std::collections::HashMap;\\nuse std::rc::Rc;\\n// use std::borrow::BorrowMut;\\n\\nfn main() {\\n let shared_map: Rc> = Rc::new(RefCell::new(HashMap::new()));\\n // Create a new block to limit the scope of the dynamic borrow\\n {\\n let mut map: RefMut<_> = shared_map.borrow_mut();\\n map.insert(\\"africa\\", 92388);\\n map.insert(\\"kyoto\\", 11837);\\n map.insert(\\"piccadilly\\", 11826);\\n map.insert(\\"marbles\\", 38);\\n }\\n\\n // Note that if we had not let the previous borrow of the cache fall out\\n // of scope then the subsequent borrow would cause a dynamic thread panic.\\n // This is the major hazard of using `RefCell`.\\n let total: i32 = shared_map.borrow().values().sum();\\n println!(\\"{total}\\");\\n}\\n```\\n\\nWe allocate a hash map on the heap and then in the inner block, we borrow it as\\na mutable reference, so that we can use it.\\n\\n:::note\\n\\nIt is a very primitive example for `Rc>` and mutable borrow.\\n\\n:::\\n\\nIf you uncomment the 4th line with `use std::borrow::BorrowMut;`, you cannot\\ncompile the code anymore, because of\\n\\n```\\n Compiling playground v0.0.1 (/playground)\\nerror[E0308]: mismatched types\\n --\x3e src/main.rs:10:34\\n |\\n10 | let mut map: RefMut<_> = shared_map.borrow_mut();\\n | --------- ^^^^^^^^^^^^^^^^^^^^^^^ expected struct `RefMut`, found mutable reference\\n | |\\n | expected due to this\\n |\\n = note: expected struct `RefMut<\'_, _>`\\n found mutable reference `&mut Rc>>`\\n\\nerror[E0599]: no method named `insert` found for struct `RefMut<\'_, _>` in the current scope\\n --\x3e src/main.rs:11:13\\n |\\n11 | map.insert(\\"africa\\", 92388);\\n | ^^^^^^ method not found in `RefMut<\'_, _>`\\n\\nerror[E0599]: no method named `insert` found for struct `RefMut<\'_, _>` in the current scope\\n --\x3e src/main.rs:12:13\\n |\\n12 | map.insert(\\"kyoto\\", 11837);\\n | ^^^^^^ method not found in `RefMut<\'_, _>`\\n\\nerror[E0599]: no method named `insert` found for struct `RefMut<\'_, _>` in the current scope\\n --\x3e src/main.rs:13:13\\n |\\n13 | map.insert(\\"piccadilly\\", 11826);\\n | ^^^^^^ method not found in `RefMut<\'_, _>`\\n\\nerror[E0599]: no method named `insert` found for struct `RefMut<\'_, _>` in the current scope\\n --\x3e src/main.rs:14:13\\n |\\n14 | map.insert(\\"marbles\\", 38);\\n | ^^^^^^ method not found in `RefMut<\'_, _>`\\n\\nSome errors have detailed explanations: E0308, E0599.\\nFor more information about an error, try `rustc --explain E0308`.\\nerror: could not compile `playground` due to 5 previous errors\\n```\\n\\nIt might seem **a bit** ridiculous. However, I got to a point where the compiler\\nsuggested `use std::borrow::BorrowMut;` and it resulted in breaking parts of the\\ncode that worked previously. I think it may be a good idea to go over what is\\nhappening here.\\n\\n##### `.borrow_mut()` on `Rc>`\\n\\nLet\'s consider a variable `x` of type `Rc>`. What happens when you\\ncall `.borrow_mut()` on it? We can look at the `Rc` type, and\u2026 hang on! There is\\nneither `.borrow_mut()` method or `BorrowMut` trait implemented. How can we do it\\nthen?\\n\\nLet\'s go further and we can see that `RefCell` implements a `.borrow_mut()`\\nmethod. OK, but how can we call it on the `Rc`? Easily! 
`Rc` implements\\n`Deref` and therefore you can call methods on `Rc` objects as if they were\\n`T` objects. If we read on _`Deref` coercion_, we can see the following:\\n\\n> If `T` implements `Deref`, \u2026:\\n>\\n> - \u2026\\n> - `T` implicitly implements all the (immutable) methods of the type `U`.\\n\\nWhat is the requirement for the `.borrow_mut()` on `RefCell`? Well, it needs\\n`&self`, so the `Deref` implements the `.borrow_mut()` for the `Rc>`.\\n\\n##### `BorrowMut` trait\\n\\nI have not been able to find a lot on this trait. My guess is that it provides a\\nmethod instead of a syntactic sugar (`&mut x`) for the mutable borrow. And also\\nit provides default implementations for the types:\\n\\n```rust\\nimpl BorrowMut for String\\n\\nimpl BorrowMut for &mut T\\nwhere\\n T: ?Sized,\\n\\nimpl BorrowMut for T\\nwhere\\n T: ?Sized,\\n\\nimpl BorrowMut<[T]> for Vec\\nwhere\\n A: Allocator,\\n\\nimpl BorrowMut for Box\\nwhere\\n A: Allocator,\\n T: ?Sized,\\n\\nimpl BorrowMut<[T]> for [T; N]\\n```\\n\\n##### Conflict\\n\\nNow the question is why did it break the code\u2026 My first take was that the type\\n`Rc>` has some _specialized_ implementation of the `.borrow_mut()` and\\nthe `use` overrides it with the default, which is true **in a sense**. However\\nthere is no _specialized_ implementation. Let\'s have a look at the trait and the\\ntype signature on the `RefCell`:\\n\\n```rust\\n// trait\\npub trait BorrowMut: Borrow\\nwhere\\n Borrowed: ?Sized,\\n{\\n fn borrow_mut(&mut self) -> &mut Borrowed;\\n}\\n\\n// \u2039RefCell.borrow_mut()\u203a type signature\\npub fn borrow_mut(&self) -> RefMut<\'_, T>\\n```\\n\\nI think that we can definitely agree on the fact that `RefMut<\'_, T>` is not the\\n`RefCell`.\\n\\n**In my opinion**, `RefCell` implements a **separate** `.borrow_mut()` rather\\nthan implementing the interface, because it **cannot** satisfy the type requirements\\nof the trait.\\n\\n:::caution\\n\\nI wonder how are we expected to deal with this conflict, if and when, we need\\nboth the `.borrow_mut()` of the trait and `.borrow_mut()` of the `RefCell`.\\n\\n:::\\n\\n:::tip Fun fact\\n\\nI was suggested by the compiler to do `use std::borrow::BorrowMut;` and break the\\ncode.\\n\\nSo much for the _almighty_ and _helpful_ compiler\u2026\\n\\n:::\\n\\n## [Day 21: Monkey Math](https://adventofcode.com/2022/day/21)\\n\\n:::info tl;dr\\n\\nComputing an expression tree and then also finding ideal value for a node.\\n\\n:::\\n\\n### Solution\\n\\nRelatively simple, until you get to the 2nd part where you start to practice\\na lot of the copy-paste. I have managed to sneak some perverted stuff in there\\nthough :) Let\'s go through the details.\\n\\n#### `Default` trait\\n\\nFor the first time and twice I had a need to have a default value for my types,\\nenumerations in this case. Rust offers a very nice trait[^1] that is described\\nas:\\n\\n> A trait for giving a type a useful default value.\\n\\nI guess it sums it up nicely. The more interesting part about this is the fact\\nthat you can use the _macro machinery_ to save yourself some typing. If you have\\nenumeration of which the default value doesn\'t bear any parameter, you can just\\ndo[^2]:\\n\\n```rust\\n#[derive(Default)]\\nenum Color {\\n #[default]\\n White,\\n Gray,\\n Black,\\n}\\n```\\n\\n#### Abusing negation\\n\\nIf you want to use a _unary minus_ operator on your own type, you can implement\\na `Neg` trait[^3]. 
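\\n\\nA minimal sketch of what such an implementation can look like (the `Side` enum here is purely illustrative):\\n\\n```rust\\nuse std::ops::Neg;\\n\\n#[derive(Clone, Copy, Debug, PartialEq, Eq)]\\nenum Side {\\n    Left,\\n    Right,\\n}\\n\\nimpl Neg for Side {\\n    type Output = Self;\\n\\n    fn neg(self) -> Self::Output {\\n        match self {\\n            Side::Left => Side::Right,\\n            Side::Right => Side::Left,\\n        }\\n    }\\n}\\n\\nfn main() {\\n    assert_eq!(-Side::Left, Side::Right);\\n}\\n```\\n\\n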
I was dealing with a binary tree and needed a way how to look\\nat the other side, so I have just implemented the negation for flipping between\\nleft and right :smile:\\n\\n[^1]: [`Default`](https://doc.rust-lang.org/std/default/trait.Default.html) docs\\n[^2]: Pardon my example from the graph algorithms ;)\\n[^3]: [`Neg`](https://doc.rust-lang.org/std/ops/trait.Neg.html) docs\\n\\n[_advent of code_]: https://adventofcode.com\\n[`itertools`]: https://crates.io/crates/itertools\\n[this reddit post and the comment]: https://www.reddit.com/r/adventofcode/comments/zb98pn/comment/iyq0ono"},{"id":"leetcode/sort-diagonally","metadata":{"permalink":"/blog/leetcode/sort-diagonally","editUrl":"https://github.com/mfocko/blog/tree/main/blog/leetcode/sort-matrix-diagonally.md","source":"@site/blog/leetcode/sort-matrix-diagonally.md","title":"Sort the matrix diagonally","description":"Compiler assisted development.","date":"2023-03-04T23:15:00.000Z","formattedDate":"March 4, 2023","tags":[{"label":"cpp","permalink":"/blog/tags/cpp"},{"label":"leetcode","permalink":"/blog/tags/leetcode"},{"label":"iterators","permalink":"/blog/tags/iterators"}],"readingTime":16.99,"hasTruncateMarker":true,"authors":[{"name":"Matej Focko","email":"me+blog@mfocko.xyz","title":"a.k.a. @mf","url":"https://gitlab.com/mfocko","imageURL":"https://github.com/mfocko.png","key":"mf"}],"frontMatter":{"title":"Sort the matrix diagonally","description":"Compiler assisted development.","date":"2023-03-04T23:15","slug":"leetcode/sort-diagonally","authors":"mf","tags":["cpp","leetcode","iterators"],"hide_table_of_contents":false},"unlisted":false,"prevItem":{"title":"3rd week of Advent of Code \'22 in Rust","permalink":"/blog/aoc-2022/3rd-week"},"nextItem":{"title":"2nd week of Advent of Code \'22 in Rust","permalink":"/blog/aoc-2022/2nd-week"}},"content":"Let\'s try to solve one of the LeetCode challenges in easy and hard mode at the\\nsame time.\\n\\n\x3c!--truncate--\x3e\\n\\n- Link to the problem: https://leetcode.com/problems/sort-the-matrix-diagonally/\\n\\n## Problem description\\n\\nA **matrix diagonal** is a diagonal line of cells starting from some cell in\\neither the topmost row or leftmost column and going in the bottom-right direction\\nuntil reaching the matrix\'s end. For example, the **matrix diagonal** starting\\nfrom `mat[2][0]`, where `mat` is a `6 x 3` matrix, includes cells `mat[2][0]`,\\n`mat[3][1]`, and `mat[4][2]`.\\n\\nGiven an `m x n` matrix `mat` of integers, sort each matrix diagonal in ascending\\norder and return the resulting matrix.\\n\\n### Example\\n\\n![Image describing the problem](https://assets.leetcode.com/uploads/2020/01/21/1482_example_1_2.png)\\n\\n## Skeleton and initial adjustments\\n\\nWe are given the following skeleton for the C++ and the given challenge:\\n\\n```cpp\\nclass Solution {\\npublic:\\n vector> diagonalSort(vector>& mat) {\\n\\n }\\n};\\n```\\n\\nThe task is to sort the passed matrix diagonally and then return it. First of all,\\nI don\'t like to solve this in a web browser, so we\'ll need to adjust it accordingly\\nfor running it locally. 
We\'ll start by including the `vector` header and using\\nfully-qualified namespaces[^1] and also adding few tests:\\n\\n```cpp\\n#include \\n#include \\n\\nusing matrix = std::vector>;\\n\\nclass Solution {\\npublic:\\n matrix diagonalSort(matrix& mat)\\n {\\n }\\n};\\n\\nstatic void test_case_1()\\n{\\n // Input: mat = [[3,3,1,1],[2,2,1,2],[1,1,1,2]]\\n // Output: [[1,1,1,1],[1,2,2,2],[1,2,3,3]]\\n\\n Solution s;\\n assert((s.diagonalSort(std::vector { std::vector { 3, 3, 1, 1 },\\n std::vector { 2, 2, 1, 2 },\\n std::vector { 1, 1, 1, 2 } })\\n == std::vector { std::vector { 1, 1, 1, 1 },\\n std::vector { 1, 2, 2, 2 },\\n std::vector { 1, 2, 3, 3 } }));\\n}\\n\\nstatic void test_case_2()\\n{\\n // Input: mat =\\n // [[11,25,66,1,69,7],[23,55,17,45,15,52],[75,31,36,44,58,8],[22,27,33,25,68,4],[84,28,14,11,5,50]]\\n // Output:\\n // [[5,17,4,1,52,7],[11,11,25,45,8,69],[14,23,25,44,58,15],[22,27,31,36,50,66],[84,28,75,33,55,68]]\\n\\n Solution s;\\n assert((s.diagonalSort(std::vector { std::vector { 11, 25, 66, 1, 69, 7 },\\n std::vector { 23, 55, 17, 45, 15, 52 },\\n std::vector { 75, 31, 36, 44, 58, 8 },\\n std::vector { 22, 27, 33, 25, 68, 4 },\\n std::vector { 84, 28, 14, 11, 5, 50 } })\\n == std::vector { std::vector { 5, 17, 4, 1, 52, 7 },\\n std::vector { 11, 11, 25, 45, 8, 69 },\\n std::vector { 14, 23, 25, 44, 58, 15 },\\n std::vector { 22, 27, 31, 36, 50, 66 },\\n std::vector { 84, 28, 75, 33, 55, 68 } }));\\n}\\n\\nint main()\\n{\\n test_case_1();\\n test_case_2();\\n\\n return 0;\\n}\\n```\\n\\nWe need to return the matrix, but we\'re given a reference to the input matrix. We\\ncan easily abuse the C++ here and just switch the reference to value, this way\\nthe matrix will be copied when passed to the function, we can sort the copy and\\njust return it back. And we also get yelled by the compiler for the fact that the\\nmethod doesn\'t return anything yet, so to make it \u201cshut up\u201d we will just return\\nthe input for now:\\n\\n```diff\\n- matrix diagonalSort(matrix& mat)\\n+ matrix diagonalSort(matrix mat)\\n {\\n+ return mat;\\n }\\n```\\n\\nNow, we get the copy and we\'re good to go.\\n\\n## Na\xefve solution\\n\\nAs you may know, C++ offers a plethora of functions that can be used to your\\nadvantage, given that you know how to \u201cbend\u201d the data structures accordingly.\\n\\nWhat does that mean for us? Well, we have an `std::sort`, we can use it, right?\\nLet\'s have a look at it:\\n\\n```cpp\\ntemplate< class RandomIt >\\nvoid sort( RandomIt first, RandomIt last );\\n```\\n\\nThis overload is more than we need. What does it do? It just sorts the elements\\nin the range `[first, last)` using `operator<` on them. We can\'t sort the whole\\nmatrix using this, but\u2026 we can sort just \xbbone\xab diagonal without doing much work\\non our end.\\n\\nWhat is the `RandomIt` type though? If we look more into the documentation, we\\ncan easily find the requirements for it and also learn that it\'s a _random access_\\n_iterator_ and allows swapping its values at the same time.\\n\\n:::tip Random access iterator\\n\\nWhat is the _random access iterator_ though? We can find it in a documentation\\nand see the following description:\\n\\n> A **LegacyRandomAccessIterator** is a [LegacyBidirectionalIterator](https://en.cppreference.com/w/cpp/named_req/BidirectionalIterator)\\n> that can be moved to point to any element in constant time.\\n\\nAfter that we can see all the requirements for it being listed. 
I don\'t feel like\\nreading them right now, so we will just use it and see where the compilation blows\\nup, i.e. \u201c_compiler-assisted development_\u201d[^2] if you will ;)\\n\\n:::\\n\\nNow we know that we can use `std::sort` to sort the diagonal itself, but we also\\nneed to get the diagonals somehow. I\'m rather lazy, so I\'ll just delegate it to\\nsomeone else[^3]. And that way we get\\n\\n```cpp\\nmatrix diagonalSort(matrix mat)\\n{\\n // we iterate over the diagonals\\n for (auto d : diagonals(mat)) {\\n // and we sort each diagonal\\n std::sort(d.begin(), d.end());\\n }\\n\\n // we take the matrix by copy, so we can sort in-situ and return the copy\\n // that we sorted\\n return mat;\\n}\\n```\\n\\nThis solution looks very simple, doesn\'t it? Well, cause it is.\\nLet\'s try compiling it:\\n\\n```\\nmatrix-sort.cpp:11:23: error: use of undeclared identifier \'diagonals\' [clang-diagnostic-error]\\n for (auto d : diagonals(mat)) {\\n ^\\nFound compiler error(s).\\nmake: *** [makefile:14: tidy] Error 1\\n```\\n\\nOK, seems about right. We haven\'t implemented the `diagonals` yet. And based on\\nwhat we\'ve written so far, we need a function or a class `diagonals` that will\\ngive us the diagonals we need.\\n\\n## Implementing the `diagonals`\\n\\nCool, so we need the function that will let us go through each and every diagonal\\nin our matrix. We use the _for-range_ loop, so whatever we get back from the\\n`diagonals` must support `.begin()` and `.end()`. Since I am a masochist, we will\\ndo such functionality for a matrix of any type, not just the `int` from the challenge.\\n\\nAs I said, we need to be able to\\n\\n- construct the object\\n- get the beginning\\n- get the end (the \u201csentinel\u201d)\\n\\n```cpp\\ntemplate \\nclass diagonals {\\n using matrix_t = std::vector>;\\n\\n matrix_t& _matrix;\\n\\npublic:\\n diagonals(matrix_t& m)\\n : _matrix(m)\\n {\\n }\\n diagonals_iter begin()\\n {\\n /* TODO */\\n }\\n diagonals_iter end()\\n {\\n /* TODO */\\n }\\n};\\n```\\n\\nNow we have a `diagonals` that we can use to go through the diagonals. We haven\'t\\nimplemented the core of it yet. Let\'s go through what we have for now.\\n\\nWe have a templated class with templated `T` that is used as a placeholder for any\\ntype we would store in the matrix. Because I\'m lazy, I have defined the `matrix_t`\\ntype that is a \u201cshortcut\u201d for `std::vector>`, so I don\'t have to\\ntype it out all the time. Of course, we need to store the matrix, we are given,\\nas a private attribute. And then just have the constructor and the 2 methods we\\nneed for the _for-range_.\\n\\n### Iterating over diagonals\\n\\nNow that we have an object that will allow us to iterate through the diagonals,\\nwe need to implement the iterating itself. We need to go through all of them, so\\nwe have multiple options how to do so. I have decided to start from the \u201cmain\u201d\\ndiagonal that starts at `(0, 0)` index and then proceed with the diagonals starting\\nin the first row, followed by the rest of the diagonals in the first column.\\n\\nWe need to be able to tell that we\'ve iterated through all of them, and also we\\nneed to know which diagonal is next. For that purpose we will pass the indices\\nof the first cell on the diagonal. 
That way we can always tell how to move forward.\\n\\nWe will start by updating the `begin` and `end` to reflect our choice accordingly.\\n\\n```cpp\\ndiagonals_iter begin() { return diagonals_iter { _matrix, 0, 0 }; }\\ndiagonals_iter end() { return diagonals_iter { _matrix, 0, _matrix.size() }; }\\n```\\n\\nFor the `begin` we return the first diagonal that starts at `(0, 0)`. And because\\nwe have decided to do the diagonals in the first column at the end, the first\\ndiagonal that is not a valid one is the one at `(0, height)`. Apart from the\\nindices, we also need to pass reference to the matrix itself.\\n\\n:::note\\n\\nYou may have noticed that we also include the diagonals that have length 1,\\nspecifically the ones at `(0, height - 1)` and `(width - 1, 0)`. We are implementing\\nan iterator that **should not** care about the way it\'s being used. Therefore, we\\ndon\'t care about the fact they don\'t need to be sorted.\\n\\n:::\\n\\nCool, let\'s leave the iterator itself to someone else, right?[^4]\\n\\n### Implementing the iterator over diagonals\\n\\nWe can start with a simple skeleton based on the information that we pass from\\nthe `diagonals`. Also to utilize the `matrix_t` and also contain implementation\\ndetails hidden away, we will put this code into the `diagonals` class.\\n\\n```cpp\\nclass diagonals_iter {\\n matrix_t& m;\\n std::size_t x;\\n std::size_t y;\\n\\npublic:\\n diagonals_iter(matrix_t& matrix, std::size_t x, std::size_t y)\\n : m(matrix)\\n , x(x)\\n , y(y)\\n {\\n }\\n};\\n```\\n\\nIn this case we will be implementing a \u201csimple\u201d forward iterator, so we don\'t\\nneed to implement a lot. Notably it will be:\\n\\n- inequality operator (we need to know when we reach the end and have nothing to\\n iterate over)\\n- preincrementation operator (we need to be able to move around the iterable)\\n- dereference operator (we need to be able to retrieve the objects we iterate\\n over)\\n\\n```cpp\\nclass diagonals_iter {\\n matrix_t& m;\\n std::size_t x;\\n std::size_t y;\\n\\npublic:\\n diagonals_iter(matrix_t& matrix, std::size_t x, std::size_t y)\\n : m(matrix)\\n , x(x)\\n , y(y)\\n {\\n }\\n\\n bool operator!=(const diagonals_iter& rhs) const\\n {\\n // iterators are not equal if they reference different matrices, or\\n // their positions differ\\n return m != rhs.m || x != rhs.x || y != rhs.y;\\n }\\n\\n diagonals_iter& operator++()\\n {\\n if (y != 0) {\\n // iterating through diagonals down the first column\\n y++;\\n return *this;\\n }\\n\\n // iterating the diagonals along the first row\\n x++;\\n if (x == m.front().size()) {\\n // switching to diagonals in the first column\\n x = 0;\\n y++;\\n }\\n\\n return *this;\\n }\\n\\n diagonal operator*() const { return diagonal { m, x, y }; }\\n};\\n```\\n\\nLet\'s go one-by-one. Inequality operator is rather simple, just compare iterator\'s\\nattributes field-by-field. If you think about it, checking inequality of two 2D\\nvectors may be a bit inefficient, therefore, we can swap around and check it as\\na last thing.\\n\\n```diff\\n- return m != rhs.m || x != rhs.x || y != rhs.y;\\n+ return x != rhs.x || y != rhs.y || m != rhs.m;\\n```\\n\\nPreincrementation is where the magic happens. If you have a better look, you can\\nsee two branches of this operation:\\n\\n1. When `y != 0` (we\'re iterating over the diagonals in the first column)\\n In this case, we just bump the row and we\'re done.\\n2. 
When `y == 0` (we\'re iterating over the diagonals in the first row)\\n In this case, we bump the column and check if we haven\'t gotten out of bounds,\\n i.e. the end of the first row. If we get out of the bounds, we\'re continuing\\n with the second diagonal in the first column.\\n\\nDereferencing the iterator must \u201cyield\u201d something. In our case it will be the\\ndiagonal that we want to sort. For sorting we need just the iterators that can\\nmove around said diagonal. The simplest thing, we can do, is to delegate it to\\nsomething else. In our case it will be a class called `diagonal`.\\n\\n## Implementing the `diagonal` itself\\n\\nAfter implementing the iterator over diagonals, we know that all we need to describe\\na diagonal is the matrix itself and the \u201cstart\u201d of the diagonal (row and column).\\nAnd we also know that the diagonal must provide some iterators for the `std::sort`\\nfunction. We can start with the following skeleton:\\n\\n```cpp\\ntemplate \\nclass diagonal {\\n using matrix_t = std::vector>;\\n\\n matrix_t& matrix;\\n std::size_t x;\\n std::size_t y;\\n\\npublic:\\n diagonal(matrix_t& matrix, std::size_t x, std::size_t y)\\n : matrix(matrix)\\n , x(x)\\n , y(y)\\n {\\n }\\n\\n diagonal_iter begin() const { return diagonal_iter { matrix, x, y }; }\\n\\n diagonal_iter end() const\\n {\\n auto max_x = matrix[y].size();\\n auto max_y = matrix.size();\\n\\n // we need to find the distance in which we get out of bounds (either in\\n // column or row)\\n auto steps = std::min(max_x - x, max_y - y);\\n\\n return diagonal_iter { matrix, x + steps, y + steps };\\n }\\n};\\n```\\n\\nInitialization is rather simple, we just \u201ckeep\u201d the stuff we get, `begin` is the\\nsimplest, we just delegate.\\n\\nIn case of the `end`, it gets more complicated. We need to know where is the \u201cend\u201d\\nof the diagonal. Since `end` should point to the first element \u201cafter\u201d the iterable,\\nwe know that it\'s the first position of the iterator where either `y` becomes\\n`matrix.size()` or `x` becomes `matrix[y].size()`. Also we are moving along diagonal,\\nduh, therefore we can deduce the first \u201cposition\u201d afterwards by minimal amount of\\nsteps to get out of the any column or row, hence `std::min(max_x - x, max_y - y)`.\\nFinal position is then computed simply by adding the steps to the beginning of\\nthe diagonal.\\n\\nNow we just need to finish the iterator for the diagonal itself and we\'re done.\\n\\n### Implementing `diagonal_iter`\\n\\nThis part is the hardest from all we need to do. It\'s because of the requirements\\nof the `std::sort` that requires us to implement a _random access iterator_. I have\\nbriefly described it above, and \u201cin a nutshell\u201d it means that we need to implement\\nan iterator that can move in constant time along the diagonal in any amount of\\nsteps.\\n\\nLet\'s go through all of the functionality that our iterator needs to support to\\nbe used in `std::sort`. We need the usual operations like:\\n\\n- equality/inequality\\n- incrementation\\n- dereferencing\\n\\nWe will also add all the types that our iterator uses with the category of the\\niterator, i.e. 
what interface it supports:\\n\\n```cpp\\nclass diagonal_iter {\\n // we need to keep reference to the matrix itself\\n matrix_t& m;\\n\\n // we need to be able to tell our current position\\n std::size_t x;\\n std::size_t y;\\n\\npublic:\\n using difference_type = std::ptrdiff_t;\\n using value_type = T;\\n using pointer = T*;\\n using reference = T&;\\n using iterator_category = std::random_access_iterator_tag;\\n\\n diagonal_iter(matrix_t& matrix,\\n std::size_t x,\\n std::size_t y)\\n : m(matrix)\\n , x(x)\\n , y(y)\\n {\\n }\\n\\n bool operator==(const diagonal_iter& rhs) const\\n {\\n return x == rhs.x && y == rhs.y && m == rhs.m;\\n }\\n\\n diagonal_iter& operator++()\\n {\\n // we are moving along the diagonal, so we increment both \u2039x\u203a and \u2039y\u203a at\\n // the same time\\n x++;\\n y++;\\n return *this;\\n }\\n\\n reference operator*() const { return m[y][x]; }\\n};\\n```\\n\\nThis is pretty similar to the previous iterator, but now we need to implement the\\nremaining requirements of the _random access iterator_. Let\'s see what those are:\\n\\n- decrementation - cause we need to be able to move backwards too, since _random _\\n _access iterator_ extends the interface of _bidirectional iterator_\\n- moving the iterator in either direction by steps given as an integer\\n- being able to tell the distance between two iterators\\n- define an ordering on the iterators\\n\\nLet\'s fill them in:\\n\\n```cpp\\nclass diagonal_iter {\\n // we need to keep reference to the matrix itself\\n matrix_t& m;\\n\\n // we need to be able to tell our current position\\n std::size_t x;\\n std::size_t y;\\n\\npublic:\\n using difference_type = std::ptrdiff_t;\\n using value_type = T;\\n using pointer = T*;\\n using reference = T&;\\n using iterator_category = std::random_access_iterator_tag;\\n\\n diagonal_iter(matrix_t& matrix,\\n std::size_t x,\\n std::size_t y)\\n : m(matrix)\\n , x(x)\\n , y(y)\\n {\\n }\\n\\n bool operator==(const diagonal_iter& rhs) const\\n {\\n return x == rhs.x && y == rhs.y && m == rhs.m;\\n }\\n\\n diagonal_iter& operator++()\\n {\\n // we are moving along the diagonal, so we increment both \u2039x\u203a and \u2039y\u203a at\\n // the same time\\n x++;\\n y++;\\n return *this;\\n }\\n\\n reference operator*() const { return m[y][x]; }\\n\\n // exactly opposite to the incrementation\\n diagonal_iter operator--()\\n {\\n x--;\\n y--;\\n return *this;\\n }\\n\\n // moving \u2039n\u203a steps back is same as calling decrementation \u2039n\u203a-times, so we\\n // can just return a new iterator and subtract \u2039n\u203a from both coordinates in\\n // the matrix\\n diagonal_iter operator-(difference_type n) const\\n {\\n return diagonal_iter { m, x - n, y - n };\\n }\\n\\n // here we assume that we are given two iterators on the same diagonal\\n difference_type operator-(const diagonal_iter& rhs) const\\n {\\n assert(m == rhs.m);\\n return x - rhs.x;\\n }\\n\\n // counterpart of moving \u2039n\u203a steps backwards\\n diagonal_iter operator+(difference_type n) const\\n {\\n return diagonal_iter { m, x + n, y + n };\\n }\\n\\n // we compare the coordinates, and also assume that those 2 iterators are\\n // lying on the same diagonal\\n bool operator<(const diagonal_iter& rhs) const\\n {\\n assert(m == rhs.m);\\n return x < rhs.x && y < rhs.y;\\n }\\n};\\n```\\n\\nAt this point we could probably try and compile it, right? 
If we do so, we will\\nget yelled at by a compiler for the following reasons:\\n\\n```\\n/usr/bin/../lib/gcc/x86_64-redhat-linux/12/../../../../include/c++/12/bits/stl_algo.h:1792:11: error: object of type \'diagonal::diagonal_iter\' cannot be assigned because its copy assignment operator is implicitly deleted [clang-diagnostic-error]\\n __last = __next;\\n ^\\n/usr/bin/../lib/gcc/x86_64-redhat-linux/12/../../../../include/c++/12/bits/stl_algo.h:1817:11: note: in instantiation of function template specialization \'std::__unguarded_linear_insert::diagonal_iter, __gnu_cxx::__ops::_Val_less_iter>\' requested here\\n std::__unguarded_linear_insert(__i,\\n ^\\n/usr/bin/../lib/gcc/x86_64-redhat-linux/12/../../../../include/c++/12/bits/stl_algo.h:1849:9: note: in instantiation of function template specialization \'std::__insertion_sort::diagonal_iter, __gnu_cxx::__ops::_Iter_less_iter>\' requested here\\n std::__insertion_sort(__first, __first + int(_S_threshold), __comp);\\n ^\\n/usr/bin/../lib/gcc/x86_64-redhat-linux/12/../../../../include/c++/12/bits/stl_algo.h:1940:9: note: in instantiation of function template specialization \'std::__final_insertion_sort::diagonal_iter, __gnu_cxx::__ops::_Iter_less_iter>\' requested here\\n std::__final_insertion_sort(__first, __last, __comp);\\n ^\\n/usr/bin/../lib/gcc/x86_64-redhat-linux/12/../../../../include/c++/12/bits/stl_algo.h:4820:12: note: in instantiation of function template specialization \'std::__sort::diagonal_iter, __gnu_cxx::__ops::_Iter_less_iter>\' requested here\\n std::__sort(__first, __last, __gnu_cxx::__ops::__iter_less_iter());\\n ^\\nmatrix-sort.cpp:161:18: note: in instantiation of function template specialization \'std::sort::diagonal_iter>\' requested here\\n std::sort(d.begin(), d.end());\\n ^\\nmatrix-sort.cpp:17:19: note: copy assignment operator of \'diagonal_iter\' is implicitly deleted because field \'m\' is of reference type \'diagonal::matrix_t &\' (aka \'vector> &\')\\n matrix_t& m;\\n ^\\n/usr/bin/../lib/gcc/x86_64-redhat-linux/12/../../../../include/c++/12/bits/stl_algo.h:1830:2: error: no matching function for call to \'__unguarded_linear_insert\' [clang-diagnostic-error]\\n std::__unguarded_linear_insert(__i,\\n ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~\\n/usr/bin/../lib/gcc/x86_64-redhat-linux/12/../../../../include/c++/12/bits/stl_algo.h:1850:9: note: in instantiation of function template specialization \'std::__unguarded_insertion_sort::diagonal_iter, __gnu_cxx::__ops::_Iter_less_iter>\' requested here\\n std::__unguarded_insertion_sort(__first + int(_S_threshold), __last,\\n ^\\n/usr/bin/../lib/gcc/x86_64-redhat-linux/12/../../../../include/c++/12/bits/stl_algo.h:1940:9: note: in instantiation of function template specialization \'std::__final_insertion_sort::diagonal_iter, __gnu_cxx::__ops::_Iter_less_iter>\' requested here\\n std::__final_insertion_sort(__first, __last, __comp);\\n ^\\n/usr/bin/../lib/gcc/x86_64-redhat-linux/12/../../../../include/c++/12/bits/stl_algo.h:4820:12: note: in instantiation of function template specialization \'std::__sort::diagonal_iter, __gnu_cxx::__ops::_Iter_less_iter>\' requested here\\n std::__sort(__first, __last, __gnu_cxx::__ops::__iter_less_iter());\\n ^\\nmatrix-sort.cpp:161:18: note: in instantiation of function template specialization \'std::sort::diagonal_iter>\' requested here\\n std::sort(d.begin(), d.end());\\n ^\\n/usr/bin/../lib/gcc/x86_64-redhat-linux/12/../../../../include/c++/12/bits/stl_algo.h:1782:5: note: candidate template ignored: substitution failure [with 
_RandomAccessIterator = diagonal::diagonal_iter, _Compare = __gnu_cxx::__ops::_Val_less_iter]\\n __unguarded_linear_insert(_RandomAccessIterator __last,\\n ^\\n/usr/bin/../lib/gcc/x86_64-redhat-linux/12/../../../../include/c++/12/bits/stl_algo.h:1923:11: error: object of type \'diagonal::diagonal_iter\' cannot be assigned because its copy assignment operator is implicitly deleted [clang-diagnostic-error]\\n __last = __cut;\\n ^\\n/usr/bin/../lib/gcc/x86_64-redhat-linux/12/../../../../include/c++/12/bits/stl_algo.h:1937:9: note: in instantiation of function template specialization \'std::__introsort_loop::diagonal_iter, long, __gnu_cxx::__ops::_Iter_less_iter>\' requested here\\n std::__introsort_loop(__first, __last,\\n ^\\n/usr/bin/../lib/gcc/x86_64-redhat-linux/12/../../../../include/c++/12/bits/stl_algo.h:4820:12: note: in instantiation of function template specialization \'std::__sort::diagonal_iter, __gnu_cxx::__ops::_Iter_less_iter>\' requested here\\n std::__sort(__first, __last, __gnu_cxx::__ops::__iter_less_iter());\\n ^\\nmatrix-sort.cpp:161:18: note: in instantiation of function template specialization \'std::sort::diagonal_iter>\' requested here\\n std::sort(d.begin(), d.end());\\n ^\\nmatrix-sort.cpp:17:19: note: copy assignment operator of \'diagonal_iter\' is implicitly deleted because field \'m\' is of reference type \'diagonal::matrix_t &\' (aka \'vector> &\')\\n matrix_t& m;\\n ^\\n```\\n\\nThat\'s a lot of noise, isn\'t it? Let\'s focus on the important parts:\\n\\n```\\n/usr/bin/../lib/gcc/x86_64-redhat-linux/12/../../../../include/c++/12/bits/stl_algo.h:1792:11: error: object of type \'diagonal::diagonal_iter\' cannot be assigned because its copy assignment operator is implicitly deleted [clang-diagnostic-error]\\n\u2026\\nmatrix-sort.cpp:17:19: note: copy assignment operator of \'diagonal_iter\' is implicitly deleted because field \'m\' is of reference type \'diagonal::matrix_t &\' (aka \'vector> &\')\\n matrix_t& m;\\n ^\\n```\\n\\nAh! We have a reference in our iterator, and this prevents us from having a copy\\nassignment operator (that is used \u201csomewhere\u201d in the sorting algorithm). Well\u2026\\nLet\'s just wrap it!\\n\\n```diff\\n# we need to keep a different type than reference\\n- matrix_t& m;\\n+ std::reference_wrapper m;\\n\\n# in comparison we need to get the reference out of the wrapper first\\n- return x == rhs.x && y == rhs.y && m == rhs.m;\\n+ return x == rhs.x && y == rhs.y && m.get() == rhs.m.get();\\n\\n# same when we return a reference to the \u201ccell\u201d in the matrix\\n- reference operator*() const { return m[y][x]; }\\n+ reference operator*() const { return m.get()[y][x]; }\\n\\n# and finally in the assertions that we set for the \u201cdistance\u201d and \u201cless than\u201d\\n- assert(m == rhs.m);\\n+ assert(m.get() == rhs.m.get());\\n```\\n\\nWe\'re done now! We have written an iterator over diagonals for a 2D `vector`. 
You can have a look at the final result [here](pathname:///files/blog/leetcode/sort-matrix-diagonally/matrix-sort.cpp).\\n\\n[^1]: just because I\'m used to it and don\'t care about your opinion ;)\\n[^2]: exercise at your own risk\\n[^3]: me in 5 minutes in fact, but don\'t make me scared\\n[^4]: me in the next section\u2026"},{"id":"aoc-2022/2nd-week","metadata":{"permalink":"/blog/aoc-2022/2nd-week","editUrl":"https://github.com/mfocko/blog/tree/main/blog/aoc-2022/02-week-2.md","source":"@site/blog/aoc-2022/02-week-2.md","title":"2nd week of Advent of Code \'22 in Rust","description":"Surviving second week in Rust.","date":"2022-12-25T23:15:00.000Z","formattedDate":"December 25, 2022","tags":[{"label":"advent-of-code","permalink":"/blog/tags/advent-of-code"},{"label":"advent-of-code-2022","permalink":"/blog/tags/advent-of-code-2022"},{"label":"rust","permalink":"/blog/tags/rust"}],"readingTime":20.875,"hasTruncateMarker":true,"authors":[{"name":"Matej Focko","email":"me+blog@mfocko.xyz","title":"a.k.a. @mf","url":"https://gitlab.com/mfocko","imageURL":"https://github.com/mfocko.png","key":"mf"}],"frontMatter":{"title":"2nd week of Advent of Code \'22 in Rust","description":"Surviving second week in Rust.","date":"2022-12-25T23:15","slug":"aoc-2022/2nd-week","authors":"mf","tags":["advent-of-code","advent-of-code-2022","rust"],"hide_table_of_contents":false},"unlisted":false,"prevItem":{"title":"Sort the matrix diagonally","permalink":"/blog/leetcode/sort-diagonally"},"nextItem":{"title":"1st week of Advent of Code \'22 in Rust","permalink":"/blog/aoc-2022/1st-week"}},"content":"Let\'s go through the second week of [_Advent of Code_] in Rust.\\n\\n\x3c!--truncate--\x3e\\n\\n## [Day 8: Treetop Tree House](https://adventofcode.com/2022/day/8)\\n\\n:::info tl;dr\\n\\nWe get a forest and we want to know how many trees are visible from the outside.\\nApart from that we want to find the best view.\\n\\n:::\\n\\nNothing interesting. We are moving around 2D map though. And indexing can get a\\nbit painful when doing so, let\'s refactor it a bit ;) During the preparation for\\nthe AoC, I have written `Vector2D` and now it\'s time to extend it with indexing\\nof `Vec` of `Vec`s. In my solution I was manipulating with indices in the following\\nway:\\n\\n- swapping them\\n- checking whether they are correct indices for the `Vec>`\\n- indexing `Vec>` with them\\n\\n:::caution\\n\\nI\'m getting familiar with Rust and starting to \u201cabuse\u201d it\u2026 While doing so, I\'m\\nalso uncovering some \u201cfeatures\u201d that I don\'t really like. Therefore I will mark\\nall of my rants with _thicc_ **\xab\u21af\xbb** mark and will try to \u201clock\u201d them into their\\nown \u201cbox of hell\u201d.\\n\\n:::\\n\\n#### Swapping indices\\n\\nRelatively simple implementation, just take the values, swap them and return new\\nvector.\\n\\n```rust\\nimpl Vector2D {\\n pub fn swap(&self) -> Self {\\n Self {\\n x: self.y,\\n y: self.x,\\n }\\n }\\n}\\n```\\n\\nPretty straight-forward implementation, but let\'s talk about the `T: Copy`. We\\nneed to use it, since we are returning a **new** vector, with swapped **values**.\\nIf we had values that cannot be copied, the only thing we could do, would be a\\nvector of references (and it would also introduce a lifetime, to which we\'ll get\\nlater on). 
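\\n\\nJust to illustrate that alternative, a hypothetical reference-returning variant could look like this (`Vector2D` is re-declared so that the snippet stands alone, and the lifetime is written out explicitly to make it visible):\\n\\n```rust\\nstruct Vector2D<T> {\\n    x: T,\\n    y: T,\\n}\\n\\nimpl<T> Vector2D<T> {\\n    // without `T: Copy` we can only hand out references, and the borrow\\n    // ties the lifetime of the result to `self`\\n    fn swap_ref<\'a>(&\'a self) -> Vector2D<&\'a T> {\\n        Vector2D {\\n            x: &self.y,\\n            y: &self.x,\\n        }\\n    }\\n}\\n\\nfn main() {\\n    let v = Vector2D {\\n        x: String::from(\\"row\\"),\\n        y: String::from(\\"column\\"),\\n    };\\n    let swapped = v.swap_ref();\\n    println!(\\"({}, {})\\", swapped.x, swapped.y);\\n}\\n```\\n\\n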
This is pretty similar with the operations on sets from the first week.\\n\\n#### Indexing `Vec`\\n\\nI will start with the indexing, cause bound-checking is a bit more\u2026 complicated\\nthan I would like to.\\n\\n```rust\\npub fn index<\'a, T, U>(v: &\'a [Vec], idx: &Vector2D) -> &\'a U\\nwhere\\n usize: TryFrom,\\n >::Error: Debug,\\n T: Copy,\\n{\\n let (x, y): (usize, usize) = (idx.x.try_into().unwrap(), idx.y.try_into().unwrap());\\n &v[y][x]\\n}\\n```\\n\\nLet\'s talk about this mess\u2026 Body of the function is probably the most easy part\\nand should not be hard to understand, we just take the `x` and `y` and convert\\nthem both to `usize` type that can be used later on for indexing.\\n\\nThe type signature of the function is where the fun is at :wink: We are trying\\nto convert unknown type to `usize`, so we must bound the `T` as a type that can\\nbe converted to `usize`, that\'s how we got `usize: TryFrom` which basically\\nsays that `usize` must implement `TryFrom` trait and therefore allows us to\\nconvert the indices to actual `usize` indices. Using `.unwrap()` also forces us\\nto bound the error that can occur when converting `T` into `usize`, that\'s how\\nwe get `>::Error: Debug` which loosely means\\n\\n> error during conversion of `T` into `usize` must implement `Debug`,\\n> i.e. can be printed in some way or other\\n\\n`T: Copy` is required by `.try_into()` which takes `T` by-value.\\n\\nAnd now we are left only with the first line of the definition.\\n\\n:::note\\n\\nSkilled Rustaceans might notice that this implementation is rather flaky and can\\nbreak in multiple places at once. I\'ll get back to it\u2026\\n\\n:::\\n\\nLet\'s split it in multiple parts:\\n\\n- `v: &\'a [Vec]` represents the 2D `Vec`, we are indexing, `Vec` implements\\n `Slice` trait and _clippy_ recommends using `&[T]` to `&Vec`, exact details\\n are unknown to me\\n- `idx: &Vector2D` represents the _indices_ which we use, we take them by\\n reference to avoid an unnecessary copy\\n- `-> &\'a U` means that we are returning a _reference_ to some value of type `U`.\\n Now the question is what does the `\'a` mean, we can also see it as a generic\\n type declared along `T` and `U`. And the answer is _relatively_ simple, `\'a`\\n represents a _lifetime_. We take the `v` by a reference and return a reference,\\n borrow checker validates all of the _borrows_ (or references), so we need to\\n specify that our returned value has _the same lifetime_ as the vector we have\\n taken by a reference, i.e. returned reference must live at least as long as the\\n `v`. This way we can \u201cbe sure\u201d that the returned reference is valid.\\n\\n##### Issues\\n\\nFirst issue that our implementation has is the fact that we cannot get a mutable\\nreference out of that function. This could be easily resolved by introducing new\\nfunction, e.g. `index_mut`. Which I have actually done while writing this part:\\n\\n```rust\\npub fn index_mut<\'a, T, U>(v: &\'a mut [Vec], idx: &Vector2D) -> &\'a mut U\\nwhere\\n usize: TryFrom,\\n >::Error: Debug,\\n T: Copy,\\n{\\n let (x, y): (usize, usize) = (idx.x.try_into().unwrap(), idx.y.try_into().unwrap());\\n &mut v[y][x]\\n}\\n```\\n\\n:::caution **\xab\u21af\xbb** Why can\'t we use one function?\\n\\nWhen we consider a `Vec`, we don\'t need to consider containers as `T`, Rust\\nimplements indexing as traits `Index` and `IndexMut` that do the dirty work\\nbehind syntactic sugar of `container[idx]`.\\n\\nHowever, implementing of traits is not allowed for _external_ types, i.e. 
types\\nthat you haven\'t defined yourself. This means that you can implement indexing\\nover containers that you have implemented yourself, but you cannot use your own\\ntypes for indexing \u201cbuilt-in\u201d types.\\n\\nAnother part of this rabbit hole is trait `SliceIndex` that is of a relevance\\nbecause of\\n\\n```rust\\nimpl Index for [T]\\nwhere\\n I: SliceIndex<[T]>\\n\\nimpl Index for Vec\\nwhere\\n I: SliceIndex<[T]>,\\n A: Allocator\\n\\nimpl Index for [T; N]\\nwhere\\n [T]: Index\\n```\\n\\nIn other words, if your type implements `SliceIndex` trait, it can be used\\nfor indexing. As of now, this trait has all of its required methods experimental\\nand is marked as `unsafe`.\\n\\n:::\\n\\nAnother problem is a requirement for indexing either `[Vec]` or `Vec>`.\\nThis requirement could be countered by removing inner type `Vec` and constraining\\nit by a trait `Index` (or `IndexMut` respectively) in a following way\\n\\n```rust\\npub fn index<\'a, C, T>(v: &\'a [C], idx: &Vector2D) -> &\'a C::Output\\nwhere\\n usize: TryFrom,\\n >::Error: Debug,\\n T: Copy,\\n C: Index\\n{\\n let (x, y): (usize, usize) = (idx.x.try_into().unwrap(), idx.y.try_into().unwrap());\\n &v[y][x]\\n}\\n```\\n\\nGiven this, we can also give a more meaningful typename for indexing type, such\\nas `I`.\\n\\n#### Checking bounds\\n\\nNow we can get to the boundary checks, it is very similar, but a more\u2026 dirty.\\nFirst approach that came up was to convert the indices in `Vector2D` to `usize`,\\nbut when you add the indices up, e.g. when checking the neighbors, you can end\\nup with negative values which, unlike in C++, causes an error (instead of underflow\\nthat you can use to your advantage; you can easily guess how).\\n\\nSo how can we approach this then? Well\u2026 we will convert the bounds instead of\\nthe indices and that lead us to:\\n\\n```rust\\npub fn in_range(v: &[Vec], idx: &Vector2D) -> bool\\nwhere\\n usize: TryInto,\\n >::Error: Debug,\\n T: PartialOrd + Copy,\\n{\\n idx.y >= 0.try_into().unwrap()\\n && idx.y < v.len().try_into().unwrap()\\n && idx.x >= 0.try_into().unwrap()\\n && idx.x\\n < v[TryInto::::try_into(idx.y).unwrap()]\\n .len()\\n .try_into()\\n .unwrap()\\n}\\n```\\n\\nYou can tell that it\'s definitely a shitty code. Let\'s improve it now! We will\\nget back to the original idea, but do it better. We know that we cannot convert\\nnegative values into `usize`, **but** we also know that conversion like that\\nreturns a `Result` which we can use to our advantage.\\n\\n```rust\\npub fn in_range(v: &[Vec], idx: &Vector2D) -> bool\\nwhere\\n T: Copy,\\n usize: TryFrom,\\n{\\n usize::try_from(idx.y)\\n .and_then(|y| usize::try_from(idx.x).map(|x| y < v.len() && x < v[y].len()))\\n .unwrap_or(false)\\n}\\n```\\n\\n`Result` is a type similar to `Either` in Haskell and it allows us to chain\\nmultiple operations on correct results or propagate the original error without\\ndoing anything. Let\'s dissect it one-by-one.\\n\\n`try_from` is a method implemented in `TryFrom` trait, that allows you to convert\\ntypes and either successfully convert them or fail (with a reasonable error). This\\nmethod returns `Result`.\\n\\nWe call `and_then` on that _result_, let\'s have a look at the type signature of\\n`and_then`, IMO it explains more than enough:\\n\\n```rust\\npub fn and_then(self, op: F) -> Result\\nwhere\\n F: FnOnce(T) -> Result\\n```\\n\\nOK\u2026 So it takes the result and a function and returns another result with\\ndifferent value and different error. 
However we can see that the function, which\\nrepresents an operation on a result, takes just the value, i.e. it doesn\'t care\\nabout any previous error. To make it short:\\n\\n> `and_then` allows us to run an operation, which can fail, on the correct result\\n\\nWe parsed a `y` index and now we try to convert the `x` index with `try_from`\\nagain, but on that result we use `map` rather than `and_then`, why would that be?\\n\\n```rust\\npub fn map(self, op: F) -> Result\\nwhere\\n F: FnOnce(T) -> U\\n```\\n\\nHuh\u2026 `map` performs an operation that **cannot** fail. And finally we use\\n`unwrap_or` which takes the value from result, or in case of an error returns the\\ndefault that we define.\\n\\nHow does this work then? If `y` is negative, the conversion fails and the error\\npropagates all the way to `unwrap_or`, if `y` can be a correct `usize` value, then\\nwe do the same with `x`. If `x` is negative, we propagate the error as with `y`,\\nand if it\'s not, then we check whether it exceeds the higher bounds or not.\\n\\n### Solution\\n\\nRelatively simple, you just need follow the rules and not get too smart, otherwise\\nit will get back at you.\\n\\n## [Day 9: Rope Bridge](https://adventofcode.com/2022/day/9)\\n\\n:::info tl;dr\\n\\nWe get a rope with knots and we want to track how many different positions are\\nvisited with the rope\'s tail.\\n\\n:::\\n\\nBy this day, I have come to a conclusion that current skeleton for each day\\ngenerates a lot of boilerplate. And even though it can be easily copied, it\'s\\njust a waste of space and unnecessary code. Let\'s \u201csimplify\u201d this (on one end\\nwhile creating monster on the other end). I\'ve gone through what we need in the\\npreparations for the AoC. Let\'s sum up our requirements:\\n\\n- parsing\\n- part 1 & 2\\n- running on sample / input\\n- tests\\n\\nParsing and implementation of both parts is code that changes each day and we\\ncannot do anything about it. However running and testing can be simplified!\\n\\nLet\'s introduce and export a new module `solution` that will take care of all of\\nthis. We will start by introducing a trait for each day.\\n\\n```rust\\npub trait Solution {\\n fn parse_input>(pathname: P) -> Input;\\n\\n fn part_1(input: &Input) -> Output;\\n fn part_2(input: &Input) -> Output;\\n}\\n```\\n\\nThis does a lot of work for us already, we have defined a trait and for each day\\nwe will create a structure representing a specific day. That structure will also\\nimplement the `Solution` trait.\\n\\nNow we need to get rid of the boilerplate, we can\'t get rid of the `main` function,\\nbut we can at least move out the functionality.\\n\\n```rust\\nfn run(type_of_input: &str) -> Result<()>\\nwhere\\n Self: Sized,\\n{\\n tracing_subscriber::fmt()\\n .with_env_filter(EnvFilter::from_default_env())\\n .with_target(false)\\n .with_file(true)\\n .with_line_number(true)\\n .without_time()\\n .compact()\\n .init();\\n color_eyre::install()?;\\n\\n let input = Self::parse_input(format!(\\"{}s/{}.txt\\", type_of_input, Self::day()));\\n\\n info!(\\"Part 1: {}\\", Self::part_1(&input));\\n info!(\\"Part 2: {}\\", Self::part_2(&input));\\n\\n Ok(())\\n}\\n\\nfn main() -> Result<()>\\nwhere\\n Self: Sized,\\n{\\n Self::run(\\"input\\")\\n}\\n```\\n\\nThis is all part of the `Solution` trait, which can implement methods while being\\ndependent on what is provided by the implementing types. 
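\\n\\nCondensed, the shape of the trait is roughly the following (just a sketch with plain generic parameters, a hard-coded file name and `println!` instead of the tracing setup, so it is **not** 1:1 with my real code):\\n\\n```rust\\nuse std::fmt::Display;\\n\\ntrait Solution<Input, Output: Display> {\\n    // required methods, i.e. the day-specific logic\\n    fn parse_input(pathname: &str) -> Input;\\n    fn part_1(input: &Input) -> Output;\\n    fn part_2(input: &Input) -> Output;\\n\\n    // provided method, shared by every day for free\\n    fn run(type_of_input: &str) -> Result<(), Box<dyn std::error::Error>>\\n    where\\n        Self: Sized,\\n    {\\n        let input = Self::parse_input(&format!(\\"{}s/day01.txt\\", type_of_input));\\n        println!(\\"Part 1: {}\\", Self::part_1(&input));\\n        println!(\\"Part 2: {}\\", Self::part_2(&input));\\n        Ok(())\\n    }\\n}\\n\\n// dummy day, just so the example runs\\nstruct Day01;\\nimpl Solution<Vec<i64>, i64> for Day01 {\\n    // ignores the path so that the example does not touch the filesystem\\n    fn parse_input(_pathname: &str) -> Vec<i64> {\\n        vec![1, 2, 3]\\n    }\\n    fn part_1(input: &Vec<i64>) -> i64 {\\n        input.iter().sum()\\n    }\\n    fn part_2(input: &Vec<i64>) -> i64 {\\n        input.iter().product()\\n    }\\n}\\n\\nfn main() -> Result<(), Box<dyn std::error::Error>> {\\n    Day01::run(\\"sample\\")\\n}\\n```\\n\\n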
In this case, we just\\nneed to bound the `Output` type to implement `Display` that is necessary for the\\n`info!` and format string there.\\n\\nNow we can get to first of the nasty things we are going to do\u2026 And it is the\\n`day()` method that you can see being used when constructing path to the input\\nfile. That method will generate a name of the file, e.g. `day01` and we know that\\nwe can _somehow_ deduce it from the structure name, given we name it reasonably.\\n\\n```rust\\nfn day() -> String {\\n let mut day = String::from(type_name::().split(\\"::\\").next().unwrap());\\n day.make_ascii_lowercase();\\n\\n day.to_string()\\n}\\n```\\n\\n:::caution `type_name`\\n\\nThis feature is still experimental and considered to be internal, it is not\\nadvised to use it any production code.\\n\\n:::\\n\\nAnd now we can get to the nastiest stuff :weary: We will **generate** the tests!\\n\\nWe want to be able to generate tests for sample input in a following way:\\n\\n```rust\\ntest_sample!(day_01, Day01, 42, 69);\\n```\\n\\nThere\'s not much we can do, so we will write a macro to generate the tests for us.\\n\\n```rust\\n#[macro_export]\\nmacro_rules! test_sample {\\n ($mod_name:ident, $day_struct:tt, $part_1:expr, $part_2:expr) => {\\n #[cfg(test)]\\n mod $mod_name {\\n use super::*;\\n\\n #[test]\\n fn test_part_1() {\\n let sample =\\n $day_struct::parse_input(&format!(\\"samples/{}.txt\\", $day_struct::day()));\\n assert_eq!($day_struct::part_1(&sample), $part_1);\\n }\\n\\n #[test]\\n fn test_part_2() {\\n let sample =\\n $day_struct::parse_input(&format!(\\"samples/{}.txt\\", $day_struct::day()));\\n assert_eq!($day_struct::part_2(&sample), $part_2);\\n }\\n }\\n };\\n}\\n```\\n\\nWe have used it in a similar way as macros in C/C++, one of the things that we\\ncan use to our advantage is defining \u201ctype\u201d of the parameters for the macro. All\\nparameters have their name prefixed with `$` sign and you can define various \u201cforms\u201d\\nof your macro. Let\'s go through it!\\n\\nWe have following parameters:\\n\\n- `$mod_name` which represents the name for the module with tests, it is typed\\n with `ident` which means that we want a valid identifier to be passed in.\\n- `$day_struct` represents the structure that will be used for tests, it is typed\\n with `tt` which represents a _token tree_, in our case it is a type.\\n- `$part_X` represents the expected output for the `X`th part and is of type `expr`\\n which literally means an _expression_.\\n\\nApart from that we need to use `#[macro_export]` to mark the macro as exported\\nfor usage outside of the module. 
Now our skeleton looks like:\\n\\n```rust\\nuse aoc_2022::*;\\n\\ntype Input = String;\\ntype Output = String;\\n\\nstruct DayXX;\\nimpl Solution for DayXX {\\n fn parse_input>(pathname: P) -> Input {\\n file_to_string(pathname)\\n }\\n\\n fn part_1(input: &Input) -> Output {\\n todo!()\\n }\\n\\n fn part_2(input: &Input) -> Output {\\n todo!()\\n }\\n}\\n\\nfn main() -> Result<()> {\\n // DayXX::run(\\"sample\\")\\n DayXX::main()\\n}\\n\\n// test_sample!(day_XX, DayXX, , );\\n```\\n\\n### Solution\\n\\nNot much to talk about, it is relatively easy to simulate.\\n\\n## [Day 10: Cathode-Ray Tube](https://adventofcode.com/2022/day/10)\\n\\n:::info tl;dr\\n\\nEmulating basic arithmetic operations on a CPU and drawing on CRT based on the\\nCPU\'s accumulator.\\n\\n:::\\n\\nIn this day I have discovered an issue with my design of the `Solution` trait.\\nAnd the issue is caused by different types of `Output` for the part 1 and part 2.\\n\\nProblem is relatively simple and consists of simulating a CPU, I have approached\\nit in a following way:\\n\\n```rust\\nfn evaluate_instructions(instructions: &[Instruction], mut out: Output) -> Output {\\n instructions\\n .iter()\\n .fold(State::new(), |state, instruction| {\\n state.execute(instruction, &mut out)\\n });\\n\\n out\\n}\\n```\\n\\nWe just take the instructions, we have some state of the CPU and we execute the\\ninstructions one-by-one. Perfect usage of the `fold` (or `reduce` as you may know\\nit from other languages).\\n\\nYou can also see that we have an `Output` type, so the question is how can we fix\\nthat problem. And the answer is very simple and _functional_. Rust allows you to\\nhave an `enumeration` that can _bear_ some other values apart from the type itself.\\n\\n:::tip\\n\\nWe could\'ve seen something like this with the `Result` type that can be\\ndefined as\\n\\n```rust\\nenum Result {\\n Ok(T),\\n Err(E)\\n}\\n```\\n\\n###### What does that mean though?\\n\\nWhen we have an `Ok` value, it has the result itself, and when we get an `Err`\\nvalue, it has the error. This also allows us to handle _results_ in a rather\\npretty way:\\n\\n```rust\\nmatch do_something(x) {\\n Ok(y) => {\\n println!(\\"SUCCESS: {}\\", y);\\n },\\n Err(y) => {\\n eprintln!(\\"ERROR: {}\\", y);\\n }\\n}\\n```\\n\\n:::\\n\\nMy solution has a following outline:\\n\\n```rust\\nfn execute(&self, i: &Instruction, output: &mut Output) -> State {\\n // execute the instruction\\n\\n // collect results if necessary\\n match output {\\n Output::Part1(x) => self.execute_part_1(y, x),\\n Output::Part2(x) => self.execute_part_2(y, x),\\n }\\n\\n // return the obtained state\\n new_state\\n}\\n```\\n\\nYou might think that it\'s a perfectly reasonable thing to do. Yes, **but** notice\\nthat the `match` statement doesn\'t _collect_ the changes in any way and also we\\npass `output` by `&mut`, so it is shared across each _iteration_ of the `fold`.\\n\\nThe dirty and ingenious thing is that `x`s are passed by `&mut` too and therefore\\nthey are directly modified by the helper functions. 
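\\n\\nTo make that a bit more concrete, here is a stripped-down sketch of the idea (the variants and the helper are illustrative, not my exact code):\\n\\n```rust\\n// illustrative sketch of the shared, mutable output\\nenum Output {\\n    Part1(i32),       // e.g. accumulated signal strength\\n    Part2(Vec<char>), // e.g. pixels drawn on the CRT\\n}\\n\\n// stands in for the helpers called from `execute`\\nfn collect(output: &mut Output) {\\n    match output {\\n        // `x` is a `&mut i32` here, so the variant is updated in place\\n        Output::Part1(x) => *x += 1,\\n        // and here `x` is a `&mut Vec<char>`\\n        Output::Part2(x) => x.push(\'#\'),\\n    }\\n}\\n\\nfn main() {\\n    let mut out = Output::Part1(0);\\n    for _ in 0..3 {\\n        collect(&mut out);\\n    }\\n\\n    if let Output::Part1(x) = out {\\n        assert_eq!(x, 3);\\n    }\\n}\\n```\\n\\n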
To sum it up and let it sit\\n\\n> We are **collecting** the result **into** an **enumeration** that is **shared**\\n> across **all** iterations of `fold`.\\n\\n### Solution\\n\\nSimilar to _Day 9_, but there are some technical details that can get you.\\n\\n## [Day 11: Monkey in the Middle](https://adventofcode.com/2022/day/11)\\n\\n:::info tl;dr\\n\\nSimulation of monkeys throwing stuff around and measuring your stress levels\\nwhile your stuff is being passed around.\\n\\n:::\\n\\nI think I decided to use regular expressions here for the first time, cause\\nparsing the input was a pain.\\n\\nAlso I didn\'t expect to implement Euclidean algorithm in Rust\u2026\\n\\n### Solution\\n\\nAgain, we\'re just running a simulation. Though I must admit it was very easy to\\nmake a small technical mistakes that could affect the final results very late.\\n\\n## [Day 12: Hill Climbing Algorithm](https://adventofcode.com/2022/day/12)\\n\\n:::info tl;dr\\n\\nFinding shortest path up the hill and also shortest path down to the ground while\\nalso rolling down the hill\u2026\\n\\n:::\\n\\nAs I have said in the _tl;dr_, we are looking for the shortest path, but the start\\nand goal differ for the part 1 and 2. So I have decided to refactor my solution\\nto a BFS algorithm that takes necessary parameters via functions:\\n\\n```rust\\nfn bfs(\\n graph: &[Vec], start: &Position, has_edge: F, is_target: G\\n) -> Option\\nwhere\\n F: Fn(&[Vec], &Position, &Position) -> bool,\\n G: Fn(&[Vec], &Position) -> bool\\n```\\n\\nWe pass the initial vertex from the caller and everything else is left to the BFS\\nalgorithm, based on the `has_edge` and `is_target` functions.\\n\\nThis was easy! And that is not very usual in Rust once you want to pass around\\nfunctions. :eyes:\\n\\n### Solution\\n\\nLooking for the shortest path\u2026 Must be Dijkstra, right? **Nope!** Half of the\\nReddit got jebaited though. In all fairness, nothing stops you from implementing\\nthe Dijkstra\'s algorithm for finding the shortest path, **but** if you know that\\nall connected vertices are in a unit (actually $d = 1$) distance from each other,\\nthen you know that running Dijkstra is equivalent to running BFS, only with worse\\ntime complexity, because of the priority heap instead of the queue.\\n\\n## [Day 13: Distress Signal](https://adventofcode.com/2022/day/13)\\n\\n:::info tl;dr\\n\\nProcessing packets with structured data from the distress signal.\\n\\n:::\\n\\nYou can implement a lot of traits if you want to. It is _imperative_ to implement\\nordering on the packets. I had a typo, so I also proceeded to implement a `Display`\\ntrait for debugging purposes:\\n\\n```rust\\nimpl Display for Packet {\\n fn fmt(&self, f: &mut std::fmt::Formatter<\'_>) -> std::fmt::Result {\\n match self {\\n Packet::Integer(x) => write!(f, \\"{x}\\"),\\n Packet::List(lst) => write!(f, \\"[{}]\\", lst.iter().map(|p| format!(\\"{p}\\")).join(\\",\\")),\\n }\\n }\\n}\\n```\\n\\n### Solution\\n\\nA lot of technical details\u2026 Parsing is nasty too\u2026\\n\\n## [Day 14: Regolith Reservoir](https://adventofcode.com/2022/day/14)\\n\\n:::info tl;dr\\n\\nLet\'s simulate falling sand grain-by-grain.\\n\\n:::\\n\\nAgain, both parts are relatively similar with minimal changes, so it is a good\\nidea to refactor it a bit. Similar approach to the [BFS above]. 
Also this is the\\nfirst day where I ran into efficiency issues and had to redo my solution to speed\\nit up just a bit.\\n\\n### Solution\\n\\nTedious.\\n\\n## Post Mortem\\n\\n### Indexing\\n\\nI was asked about the indexing after publishing the blog. And truly it is rather\\ncomplicated topic, especially after releasing `SliceIndex` trait. I couldn\'t\\nleave it be, so I tried to implement the `Index` and `IndexMut` trait.\\n\\n:::note\\n\\nI have also mentioned that the `SliceIndex` trait is `unsafe`, but truth be told,\\nonly _unsafe_ part are the 2 methods that are named `*unchecked*`. Anyways, I will\\nbe implementing the `Index*` traits for now, rather than the `SliceIndex`.\\n\\n:::\\n\\nIt\'s relatively straightforward\u2026\\n\\n```rust\\nimpl Index> for [C]\\nwhere\\n I: Copy + TryInto,\\n >::Error: Debug,\\n C: Index,\\n{\\n type Output = C::Output;\\n\\n fn index(&self, index: Vector2D) -> &Self::Output {\\n let (x, y): (usize, usize) =\\n (index.x.try_into().unwrap(), index.y.try_into().unwrap());\\n &self[y][x]\\n }\\n}\\n\\nimpl IndexMut> for [C]\\nwhere\\n I: Copy + TryInto,\\n >::Error: Debug,\\n C: IndexMut,\\n{\\n fn index_mut(&mut self, index: Vector2D) -> &mut Self::Output {\\n let (x, y): (usize, usize) =\\n (index.x.try_into().unwrap(), index.y.try_into().unwrap());\\n &mut self[y][x]\\n }\\n}\\n```\\n\\nWe can see a lot of similarities to the implementation of `index` and `index_mut`\\nfunctions. In the end, they are 1:1, just wrapped in the trait that provides a\\nsyntax sugar for `container[idx]`.\\n\\n:::note\\n\\nI have also switched from using the `TryFrom` to `TryInto` trait, since it better\\nmatches what we are using, the `.try_into` rather than `usize::try_from`.\\n\\nAlso implementing `TryFrom` automatically provides you with a `TryInto` trait,\\nsince it is relatively easy to implement. Just compare the following:\\n\\n```rust\\npub trait TryFrom: Sized {\\n type Error;\\n\\n fn try_from(value: T) -> Result;\\n}\\n\\npub trait TryInto: Sized {\\n type Error;\\n\\n fn try_into(self) -> Result;\\n}\\n```\\n\\n:::\\n\\nOK, so we have our trait implemented, we should be able to use `container[index]`,\\nright? Yes\u2026 but actually no :frowning:\\n\\n```\\nerror[E0277]: the type `[std::vec::Vec]` cannot be indexed by `aoc_2022::Vector2D`\\n --\x3e src/bin/day08.rs:26:18\\n |\\n26 | if trees[pos] > tallest {\\n | ^^^ slice indices are of type `usize` or ranges of `usize`\\n |\\n = help: the trait `std::slice::SliceIndex<[std::vec::Vec]>` is not implemented for `aoc_2022::Vector2D`\\n = note: required for `std::vec::Vec>` to implement `std::ops::Index>`\\n\\nerror[E0277]: the type `[std::vec::Vec]` cannot be indexed by `aoc_2022::Vector2D`\\n --\x3e src/bin/day08.rs:30:28\\n |\\n30 | max(tallest, trees[pos])\\n | ^^^ slice indices are of type `usize` or ranges of `usize`\\n |\\n = help: the trait `std::slice::SliceIndex<[std::vec::Vec]>` is not implemented for `aoc_2022::Vector2D`\\n = note: required for `std::vec::Vec>` to implement `std::ops::Index>`\\n\\nerror[E0277]: the type `[std::vec::Vec]` cannot be indexed by `aoc_2022::Vector2D`\\n --\x3e src/bin/day08.rs:52:28\\n |\\n52 | let max_height = trees[position];\\n | ^^^^^^^^ slice indices are of type `usize` or ranges of `usize`\\n |\\n = help: the trait `std::slice::SliceIndex<[std::vec::Vec]>` is not implemented for `aoc_2022::Vector2D`\\n = note: required for `std::vec::Vec>` to implement `std::ops::Index>`\\n```\\n\\nWhy? We have it implemented for the slices (`[C]`), why doesn\'t it work? 
Well,\\nthe fun part consists of the fact that in other place, where we were using it,\\nwe were passing the `&[Vec]`, but this is coming from a helper functions that\\ntake `&Vec>` instead. And\u2026 we don\'t implement `Index` and `IndexMut` for\\nthose. Just for the slices. \ud83e\udd2f _What are we going to do about it?_\\n\\nWe can either start copy-pasting or be smarter about it\u2026 I choose to be smarter,\\nso let\'s implement a macro! The only difference across the implementations are\\nthe types of the outer containers. Implementation doesn\'t differ **at all**!\\n\\nImplementing the macro can be done in a following way:\\n\\n```rust\\nmacro_rules! generate_indices {\\n ($container:ty) => {\\n impl Index> for $container\\n where\\n I: Copy + TryInto,\\n >::Error: Debug,\\n C: Index,\\n {\\n type Output = C::Output;\\n\\n fn index(&self, index: Vector2D) -> &Self::Output {\\n let (x, y): (usize, usize) =\\n (index.x.try_into().unwrap(), index.y.try_into().unwrap());\\n &self[y][x]\\n }\\n }\\n\\n impl IndexMut> for $container\\n where\\n I: Copy + TryInto,\\n >::Error: Debug,\\n C: IndexMut,\\n {\\n fn index_mut(&mut self, index: Vector2D) -> &mut Self::Output {\\n let (x, y): (usize, usize) =\\n (index.x.try_into().unwrap(), index.y.try_into().unwrap());\\n &mut self[y][x]\\n }\\n }\\n };\\n}\\n```\\n\\nAnd now we can simply do\\n\\n```rust\\ngenerate_indices!(VecDeque);\\ngenerate_indices!([C]);\\ngenerate_indices!(Vec);\\n// generate_indices!([C; N], const N: usize);\\n```\\n\\nThe last type (I took the inspiration from the implementations of the `Index` and\\n`IndexMut` traits) is a bit problematic, because of the `const N: usize` part,\\nwhich I haven\'t managed to be able to parse. And that\'s how I got rid of the error.\\n\\n:::note\\n\\nIf I were to use 2D-indexing over `[C; N]` slices, I\'d probably just go with the\\ncopy-paste, cause the cost of this \u201cmonstrosity\u201d outweighs the benefits of no DRY.\\n\\n:::\\n\\n#### Cause of the problem\\n\\nThis issue is relatively funny. If you don\'t use any type aliases, just the raw\\ntypes, you\'ll get suggested certain changes by the _clippy_. 
For example if you\\nconsider the following piece of code\\n\\n```rust\\nfn get_sum(nums: &Vec) -> i32 {\\n nums.iter().sum()\\n}\\n\\nfn main() {\\n let nums = vec![1, 2, 3];\\n println!(\\"Sum: {}\\", get_sum(&nums));\\n}\\n```\\n\\nand you run _clippy_ on it, you will get\\n\\n```\\nChecking playground v0.0.1 (/playground)\\nwarning: writing `&Vec` instead of `&[_]` involves a new object where a slice will do\\n --\x3e src/main.rs:1:18\\n |\\n1 | fn get_sum(nums: &Vec) -> i32 {\\n | ^^^^^^^^^ help: change this to: `&[i32]`\\n |\\n = help: for further information visit https://rust-lang.github.io/rust-clippy/master/index.html#ptr_arg\\n = note: `#[warn(clippy::ptr_arg)]` on by default\\n\\nwarning: `playground` (bin \\"playground\\") generated 1 warning\\n Finished dev [unoptimized + debuginfo] target(s) in 0.61s\\n```\\n\\nHowever, if you introduce a type alias, such as\\n\\n```rust\\ntype Numbers = Vec;\\n```\\n\\nThen _clippy_ won\'t say anything, cause there is literally nothing to suggest.\\nHowever the outcome is not the same\u2026\\n\\n[_advent of code_]: https://adventofcode.com\\n[bfs above]: #day-12-hill-climbing-algorithm"},{"id":"aoc-2022/1st-week","metadata":{"permalink":"/blog/aoc-2022/1st-week","editUrl":"https://github.com/mfocko/blog/tree/main/blog/aoc-2022/01-week-1.md","source":"@site/blog/aoc-2022/01-week-1.md","title":"1st week of Advent of Code \'22 in Rust","description":"Surviving first week in Rust.","date":"2022-12-15T01:15:00.000Z","formattedDate":"December 15, 2022","tags":[{"label":"advent-of-code","permalink":"/blog/tags/advent-of-code"},{"label":"advent-of-code-2022","permalink":"/blog/tags/advent-of-code-2022"},{"label":"rust","permalink":"/blog/tags/rust"}],"readingTime":12.4,"hasTruncateMarker":true,"authors":[{"name":"Matej Focko","email":"me+blog@mfocko.xyz","title":"a.k.a. @mf","url":"https://gitlab.com/mfocko","imageURL":"https://github.com/mfocko.png","key":"mf"}],"frontMatter":{"title":"1st week of Advent of Code \'22 in Rust","description":"Surviving first week in Rust.","date":"2022-12-15T01:15","slug":"aoc-2022/1st-week","authors":"mf","tags":["advent-of-code","advent-of-code-2022","rust"],"hide_table_of_contents":false},"unlisted":false,"prevItem":{"title":"2nd week of Advent of Code \'22 in Rust","permalink":"/blog/aoc-2022/2nd-week"},"nextItem":{"title":"Advent of Code \'22 in Rust","permalink":"/blog/aoc-2022/intro"}},"content":"Let\'s go through the first week of [_Advent of Code_] in Rust.\\n\\n\x3c!--truncate--\x3e\\n\\n:::note\\n\\nIf you wish to have a look at the solutions, you can follow them on my [GitLab].\\nMore specifically in the [`/src/bin/`].\\n\\n:::\\n\\nI will try to summarize my experience with using Rust for the AoC. Trying it out\\nages ago, I believe it will be _pain and suffering_, but we will see. For each\\nday I will also try to give a tl;dr of the problem, so that you can better imagine\\nthe relation to my woes or :+1: moments.\\n\\n## [Day 1: Calorie Counting](https://adventofcode.com/2022/day/1)\\n\\n:::info tl;dr\\n\\nAs the name suggests, we get the calories of the food contained in the elves\\nbackpacks and we want to choose the elf that has the most food ;)\\n\\n:::\\n\\n> Wakey wakey!\\n\\nProgramming in Rust at 6am definitely hits. I\'ve also forgotten to mention how I\\nhandle samples. With each puzzle you usually get a sample input and expected\\noutput. 
You can use them to verify that your solution works, or usually doesn\'t.\\n\\nAt first I\'ve decided to put asserts into my `main`, something like\\n\\n```rust\\nassert_eq!(part_1(&sample), 24000);\\ninfo!(\\"Part 1: {}\\", part_1(&input));\\n\\nassert_eq!(part_2(&sample), 45000);\\ninfo!(\\"Part 2: {}\\", part_2(&input));\\n```\\n\\nHowever, once you get further, the sample input may take some time to run itself.\\nSo in the end, I have decided to turn them into unit tests:\\n\\n```rust\\n#[cfg(test)]\\nmod tests {\\n use super::*;\\n\\n #[test]\\n fn test_part_1() {\\n let sample = parse_input(\\"samples/day01.txt\\");\\n assert_eq!(part_1(&sample), 24000);\\n }\\n\\n #[test]\\n fn test_part_2() {\\n let sample = parse_input(\\"samples/day01.txt\\");\\n assert_eq!(part_2(&sample), 45000);\\n }\\n}\\n```\\n\\nAnd later on I have noticed, it\'s hard to tell the difference between the days,\\nso I further renamed the `mod` from generic `tests` to reflect the days.\\n\\nAlso after finishing the first day puzzle, I have installed an [`sccache`] to\\ncache the builds, so that the build time is lower, cause it was kinda unbearable.\\n\\n### Solution\\n\\nWell, it\'s a pretty simple problem. You just take the input, sum the calories and\\nfind the biggest one. However, if we try to generalize to more than the biggest\\none, the fun appears. We have few options:\\n\\n- keep all the calories, sort them, take what we need\\n- keep all the calories and use max heap\\n- use min heap and maintain at most N calories that we need\\n\\n## [Day 2: Rock Paper Scissors](https://adventofcode.com/2022/day/2)\\n\\n:::info tl;dr\\n\\nYou want to know what score did you achieve while playing _Rock Paper Scissors_.\\nAnd then you want to be strategic about it.\\n\\n:::\\n\\nApart from the technical details of the puzzle, it went relatively smooth.\\n\\n### Solution\\n\\nI took relatively na\xefve approach and then tried to simplify it.\\n\\n## [Day 3: Rucksack Reorganization](https://adventofcode.com/2022/day/3)\\n\\n:::info tl;dr\\n\\nLet\'s go reorganize elves\' backpacks! Each backpacks has 2 compartments and you\\nwant to find the common item among those compartments. Each of them has priority,\\nyou care only about the sum.\\n\\n:::\\n\\nThis is the day where I started to fight the compiler and neither of us decided\\nto give up. Let\'s dive into it \\\\o/\\n\\n:::tip Fun fact\\n\\nFighting the compiler took me 30 minutes.\\n\\n:::\\n\\nWe need to find a common item among 2 collections, that\'s an easy task, right?\\nWe can construct 2 sets and find an intersection:\\n\\n```rust\\nlet top: HashSet = [1, 2, 3].iter().collect();\\nlet bottom: HashSet = [3, 4, 5].iter().collect();\\n```\\n\\nNow, the first issue that we encounter is caused by the fact that we are using\\na slice (the `[\u2026]`), iterator of that returns **references** to the numbers.\\nAnd we get immediately yelled at by the compiler, because the numbers are discarded\\nafter running the `.collect`. To fix this, we can use `.into_iter`:\\n\\n```rust\\nlet top: HashSet = [1, 2, 3].into_iter().collect();\\nlet bottom: HashSet = [3, 4, 5].into_iter().collect();\\n```\\n\\nThis way the numbers will get copied instead of referenced. OK, let\'s find the\\nintersection of those 2 collections:\\n\\n```rust\\nprintln!(\\"Common elements: {:?}\\", top.intersection(&bottom));\\n```\\n\\n```\\nCommon elements: [3]\\n```\\n\\n:::caution\\n\\nNotice that we need to do `&bottom`. It explicitly specifies that `.intersection`\\n**borrows** the `bottom`, i.e. 
takes an immutable reference to it.\\n\\n:::\\n\\nThat\'s what we want, right? Looks like it! \\\\o/\\n\\nNext part wants us to find the common element among all of the backpacks. OK, so\\nthat should be fairly easy, we have an intersection and we want to find intersection\\nover all of them.\\n\\nLet\'s have a look at the type of the `.intersection`\\n\\n```rust\\npub fn intersection<\'a>(\\n\xa0\xa0\xa0\xa0&\'a self,\\n\xa0\xa0\xa0\xa0other: &\'a HashSet\\n) -> Intersection<\'a, T, S>\\n```\\n\\nOK\u2026 Huh\u2026 But we have an example there!\\n\\n```rust\\nlet intersection: HashSet<_> = a.intersection(&b).collect();\\n```\\n\\nCool, that\'s all we need.\\n\\n```rust\\nlet top: HashSet = [1, 2, 3, 4].into_iter().collect();\\nlet bottom: HashSet = [3, 4, 5, 6].into_iter().collect();\\nlet top_2: HashSet = [2, 3, 4, 5, 6].into_iter().collect();\\nlet bottom_2: HashSet = [4, 5, 6].into_iter().collect();\\n\\nlet intersection: HashSet<_> = top.intersection(&bottom).collect();\\nprintln!(\\"Intersection: {:?}\\", intersection);\\n```\\n\\n```\\nIntersection: {3, 4}\\n```\\n\\nCool, so let\'s do the intersection with the `top_2`:\\n\\n```rust\\nlet top: HashSet = [1, 2, 3, 4].into_iter().collect();\\nlet bottom: HashSet = [3, 4, 5, 6].into_iter().collect();\\nlet top_2: HashSet = [2, 3, 4, 5, 6].into_iter().collect();\\nlet bottom_2: HashSet = [4, 5, 6].into_iter().collect();\\n\\nlet intersection: HashSet<_> = top.intersection(&bottom).collect();\\nlet intersection: HashSet<_> = intersection.intersection(&top_2).collect();\\nprintln!(\\"Intersection: {:?}\\", intersection);\\n```\\n\\nAnd we get yelled at by the compiler:\\n\\n```\\nerror[E0308]: mismatched types\\n --\x3e src/main.rs:10:58\\n |\\n10 | let intersection: HashSet<_> = intersection.intersection(&top_2).collect();\\n | ------------ ^^^^^^ expected `&i32`, found `i32`\\n | |\\n | arguments to this function are incorrect\\n |\\n = note: expected reference `&HashSet<&i32>`\\n found reference `&HashSet`\\n```\\n\\n/o\\\\ What the hell is going on here? Well, the funny thing is, that this operation\\ndoesn\'t return the elements themselves, but the references to them and when we pass\\nthe third set, it has just the values themselves, without any references.\\n\\n:::tip\\n\\nIt may seem as a very weird decision, but in fact it makes some sense\u2026 It allows\\nyou to do intersection of items that may not be possible to copy. Overall this is\\na \u201ctax\u201d for having a borrow checker ~~drilling your ass~~ having your back and\\nmaking sure you\'re not doing something naughty that may cause an **undefined**\\n**behavior**.\\n\\n:::\\n\\nTo resolve this we need to get an iterator that **clones** the elements:\\n\\n```rust\\nlet top: HashSet = [1, 2, 3, 4].into_iter().collect();\\nlet bottom: HashSet = [3, 4, 5, 6].into_iter().collect();\\nlet top_2: HashSet = [2, 3, 4, 5, 6].into_iter().collect();\\nlet bottom_2: HashSet = [4, 5, 6].into_iter().collect();\\n\\nlet intersection: HashSet<_> = top.intersection(&bottom).cloned().collect();\\nlet intersection: HashSet<_> = intersection.intersection(&top_2).cloned().collect();\\nlet intersection: HashSet<_> = intersection.intersection(&bottom_2).cloned().collect();\\nprintln!(\\"Intersection: {:?}\\", intersection);\\n```\\n\\n```\\nIntersection: {4}\\n```\\n\\n### Solution\\n\\nThe approach is pretty simple, if you omit the _1on1 with the compiler_. 
You just\\nhave some fun with the set operations :)\\n\\n## [Day 4: Camp Cleanup](https://adventofcode.com/2022/day/4)\\n\\n:::info tl;dr\\n\\nElves are cleaning up the camp and they got overlapping sections to clean up.\\nFind how many overlap and can take the day off.\\n\\n:::\\n\\n[`RangeInclusive`] is your friend not an enemy :)\\n\\n### Solution\\n\\nRelatively easy, you just need to parse the input and know what you want. Rust\'s\\n`RangeInclusive` type helped a lot, cause it took care of all abstractions.\\n\\n## [Day 5: Supply Stacks](https://adventofcode.com/2022/day/5)\\n\\n:::info tl;dr\\n\\nLet\'s play with stacks of crates.\\n\\n:::\\n\\nVery easy problem with very annoying input. You can judge yourself:\\n\\n```\\n [D]\\n[N] [C]\\n[Z] [M] [P]\\n 1 2 3\\n\\nmove 1 from 2 to 1\\nmove 3 from 1 to 3\\nmove 2 from 2 to 1\\nmove 1 from 1 to 2\\n```\\n\\nGood luck transforming that into something reasonable :)\\n\\n:::tip Fun fact\\n\\nTook me 40 minutes to parse this reasonably, including fighting the compiler.\\n\\n:::\\n\\n### Solution\\n\\nFor the initial solution I went with a manual solution (as in _I have done all_\\n_the work_. Later on I have decided to explore the `std` and interface of the\\n`std::vec::Vec` and found [`split_off`] which takes an index and splits (duh)\\nthe vector:\\n\\n```rust\\nlet mut vec = vec![1, 2, 3];\\nlet vec2 = vec.split_off(1);\\nassert_eq!(vec, [1]);\\nassert_eq!(vec2, [2, 3]);\\n```\\n\\nThis helped me simplify my solution a lot and also get rid of some _edge cases_.\\n\\n## [Day 6: Tuning Trouble](https://adventofcode.com/2022/day/6)\\n\\n:::info tl;dr\\n\\nFinding start of the message in a very weird protocol. Start of the message is\\ndenoted by $N$ unique consecutive characters.\\n\\n:::\\n\\n### Solution\\n\\nA lot of different approaches, knowing that we are dealing with input consisting\\nsolely of ASCII letters, I bit the bullet and went with sliding window and\\nconstructing sets from that window, checking if the set is as big as the window.\\n\\nOne possible optimization could consist of keeping a bit-vector (i.e. `usize`\\nvariable) of encountered characters and updating it as we go. However this has\\na different issue and that is removal of the characters from the left side of the\\nwindow. We don\'t know if the same character is not included later on.\\n\\nOther option is to do similar thing, but keeping the frequencies of the letters,\\nand again knowing we have only ASCII letters we can optimize by having a vector\\nof 26 elements that keeps count for each lowercase letter.\\n\\n## [Day 7: No Space Left On Device](https://adventofcode.com/2022/day/7)\\n\\n:::info tl;dr\\n\\nLet\'s simulate [`du`] to get some stats about our file system and then pinpoint\\ndirectories that take a lot of space and should be deleted.\\n\\n:::\\n\\n> I was waiting for this moment, and yet it got me!\\n> _imagine me swearing for hours_\\n\\n### Solution\\n\\nWe need to \u201c_build_\u201d a file system from the input that is given in a following form:\\n\\n```\\n$ cd /\\n$ ls\\ndir a\\n14848514 b.txt\\n8504156 c.dat\\ndir d\\n$ cd a\\n$ ls\\ndir e\\n29116 f\\n2557 g\\n62596 h.lst\\n$ cd e\\n$ ls\\n584 i\\n$ cd ..\\n$ cd ..\\n$ cd d\\n$ ls\\n4060174 j\\n8033020 d.log\\n5626152 d.ext\\n7214296 k\\n```\\n\\nThere are few ways in which you can achieve this and also you can assume some\\npreconditions, but why would we do that, right? 
:)\\n\\nYou can \u201cslap\u201d this in either [`HashMap`] or [`BTreeMap`] and call it a day.\\nAnd that would be boring\u2026\\n\\n:::tip\\n\\n`BTreeMap` is quite fitting for this, don\'t you think?\\n\\n:::\\n\\nI always wanted to try allocation on heap in Rust, so I chose to implement a tree.\\nI fought with the `Box` for some time and was losing\u2026\\n\\nThen I looked up some implementations of trees or linked lists and decided to try\\n`Rc>`. And I got my _ass whopped_ by the compiler once again. /o\\\\\\n\\n:::tip\\n\\n`Box` represents a dynamically allocated memory on heap. It is a single pointer,\\nyou can imagine this as `std::unique_ptr` in C++.\\n\\n`Rc` represents a dynamically allocated memory on heap. On top of that it is\\n_reference counted_ (that\'s what the `Rc` stands for). You can imagine this as\\n`std::shared_ptr` in C++.\\n\\nNow the fun stuff. Neither of them lets you **mutate** the contents of the memory.\\n\\n`Cell` allows you to mutate the memory. Can be used reasonably with types that\\ncan be copied, because the memory safety is guaranteed by copying the contents\\nwhen there is more than one **mutable** reference to the memory.\\n\\n`RefCell` is similar to the `Cell`, but the borrowing rules (how many mutable\\nreferences are present) are checked dynamically.\\n\\nSo in the end, if you want something like `std::shared_ptr` in Rust, you want\\nto have `Rc>`.\\n\\n:::\\n\\nSo, how are we going to represent the file system then? We will use an enumeration,\\nhehe, which is an algebraic data type that can store some stuff in itself :weary:\\n\\n```rust\\ntype FileHandle = Rc>;\\n\\n#[derive(Debug)]\\nenum AocFile {\\n File(usize),\\n Directory(BTreeMap),\\n}\\n```\\n\\nLet\'s go over it! `FileHandle` represents dynamically allocated `AocFile`, not\\nmuch to discuss. What does the `#[derive(Debug)]` do though? It lets us to print\\nout the value of that enumeration, it\'s derived, so it\'s not as good as if we had\\nimplemented it ourselves, but it\'s good enough for debugging, hence the name.\\n\\nNow to the fun part! `AocFile` value can be represented in two ways:\\n\\n- `File(usize)`, e.g. `AocFile::File(123)` and we can pattern match it, if we\\n need to\\n- `Directory(BTreeMap)` will represent the directory and will\\n contain map matching the name of the files (or directories) within to their\\n respective file handles\\n\\nI will omit the details about constructing this file system, cause there are a lot\\nof technicalities introduced by the nature of the input. However if you are\\ninterested, you can have a look at my solution.\\n\\nWe need to find small enough directories and also find the smallest directory that\\nwill free enough space. Now the question is, how could we do that. And there are\\nmultiple ways I will describe.\\n\\nI have chosen to implement [_tree catamorphism_] :weary:. It is basically a fold\\nover a tree data structure. We descent down into the leaves and propagate computed\\nresults all the way to the root. You can also notice that this approach is very\\nsimilar to _dynamic programming_, we find overlapping sections of the computation\\nand try to minimize the additional work (in this case: we need to know sizes of\\nour descendants, but we have already been there).\\n\\nAnother approach that has been suggested to me few days later is running DFS on\\nthe graph. And, funnily enough, we would still need to combine what we found in\\nthe branches where we descent. 
So in the end, it would work very similarly to my\\nsolution.\\n\\nOne of the more exotic options would be precomputing the required information at\\nthe same time as parsing. That could be done by adding additional fields to the\\nnodes which would allow storing such information and updating it as we construct\\nthe file system.\\n\\n## Post Mortem\\n\\nThings that have been brought up in the discussion later on.\\n\\n### `Rc` vs `Rc>`\\n\\nIt has been brought up that I have a contradicting statement regarding the\\ndynamically allocated memory. Specifically:\\n\\n- You can imagine `Rc` as an `std::shared_ptr` (in C++)\\n- When you want an equivalent of `std::shared_ptr`, you want to use\\n `Rc>`\\n\\nNow, in Rust it is a bit more complicated, because the type that represents the\\n\u201cshared pointer\u201d is `Rc`. What `RefCell` does is making sure that there is\\nonly one \u201cowner\u201d of a mutable reference at a time (and dynamically, as opposed\\nto the `Cell`).\\n\\nTherefore to be precise and correct about the equivalents of `std::shared_ptr`\\nin Rust, we can say that\\n\\n- `Rc` is an equivalent of a `const std::shared_ptr`,\\n- and `Rc>` is an equivalent of a `std::shared_ptr`.\\n\\nYou can easily see that they only differ in the mutability. (And even that is not\\nas simple as it seems, because there is also `Cell`)\\n\\n[_advent of code_]: https://adventofcode.com\\n[gitlab]: https://gitlab.com/mfocko/advent-of-code-2022\\n[`/src/bin/`]: https://gitlab.com/mfocko/advent-of-code-2022/-/tree/main/src/bin\\n[`sccache`]: https://github.com/mozilla/sccache\\n[`rangeinclusive`]: https://doc.rust-lang.org/std/ops/struct.RangeInclusive.html\\n[`split_off`]: https://doc.rust-lang.org/std/vec/struct.Vec.html#method.split_off\\n[`du`]: https://www.man7.org/linux/man-pages/man1/du.1.html\\n[`hashmap`]: https://doc.rust-lang.org/std/collections/struct.HashMap.html\\n[`btreemap`]: https://doc.rust-lang.org/std/collections/struct.BTreeMap.html\\n[_tree catamorphism_]: https://en.wikipedia.org/wiki/Catamorphism#Tree_fold"},{"id":"aoc-2022/intro","metadata":{"permalink":"/blog/aoc-2022/intro","editUrl":"https://github.com/mfocko/blog/tree/main/blog/aoc-2022/00-intro.md","source":"@site/blog/aoc-2022/00-intro.md","title":"Advent of Code \'22 in Rust","description":"Preparing for Advent of Code \'22.","date":"2022-12-14T21:45:00.000Z","formattedDate":"December 14, 2022","tags":[{"label":"advent-of-code","permalink":"/blog/tags/advent-of-code"},{"label":"advent-of-code-2022","permalink":"/blog/tags/advent-of-code-2022"},{"label":"rust","permalink":"/blog/tags/rust"}],"readingTime":8.665,"hasTruncateMarker":true,"authors":[{"name":"Matej Focko","email":"me+blog@mfocko.xyz","title":"a.k.a. 
@mf","url":"https://gitlab.com/mfocko","imageURL":"https://github.com/mfocko.png","key":"mf"}],"frontMatter":{"title":"Advent of Code \'22 in Rust","description":"Preparing for Advent of Code \'22.","date":"2022-12-14T21:45","slug":"aoc-2022/intro","authors":"mf","tags":["advent-of-code","advent-of-code-2022","rust"],"hide_table_of_contents":false},"unlisted":false,"prevItem":{"title":"1st week of Advent of Code \'22 in Rust","permalink":"/blog/aoc-2022/1st-week"}},"content":"Let\'s talk about the preparations for this year\'s [_Advent of Code_].\\n\\n\x3c!--truncate--\x3e\\n\\n## Choosing a language\\n\\nWhen choosing a language for AoC, you usually want a language that gives you a\\nquick feedback which allows you to iterate quickly to the solution of the puzzle.\\nOne of the most common choices is Python, many people also use JavaScript or Ruby.\\n\\nGiven the competitive nature of the AoC and popularity among competitive programming,\\nC++ might be also a very good choice. Only if you are familiar with it, I guess\u2026\\n\\nIf you want a challenge, you might also choose to rotate the languages each day.\\nThough I prefer to use only one language.\\n\\nFor this year I have been deciding between _Rust_, _C++_ and _Pascal_ or _Ada_.\\n\\nI have tried Rust last year and have survived with it for 3 days and then gave\\nup and switched to _Kotlin_, which was pretty good given it is \u201cJava undercover\u201d.\\nI pretty much like the ideas behind Rust, I am not sure about the whole cult and\\nimplementation of those ideas though. After some years with C/C++, I would say\\nthat Rust feels _too safe_ for my taste and tries to \u201c_punish me_\u201d even for the\\nmost trivial things.\\n\\nC++ is a very robust, but also comes with a wide variety of options providing you\\nthe ability to shoot yourself in the leg. I have tried to solve few days of previous\\nAdvent of Code events, it was _relatively easy_ to solve the problems in C++, given\\nthat I do not admit writing my own iterator for `enumerate`\u2026\\n\\nPascal or Ada were meme choices :) Ada is heavily inspired by Pascal and has a\\npretty nice standard library that offers enough to be able to quickly solve some\\nproblems in it. However the toolkit is questionable :/\\n\\n## Choosing libraries\\n\\n## Preparations for Rust\\n\\nAll of the sources, later on including solutions, can be found at my\\n[GitLab].\\n\\n### Toolkit\\n\\nSince we are using Rust, we are going to use a [Cargo] and more than likely VSCode\\nwith [`rust-analyzer`]. Because of my choice of libraries, we will also introduce\\na `.envrc` file that can be used by [`direnv`], which allows you to set specific\\nenvironment variables when you enter a directory. In our case, we will use\\n\\n```bash\\n# to show nice backtrace when using the color-eyre\\nexport RUST_BACKTRACE=1\\n\\n# to catch logs generated by tracing\\nexport RUST_LOG=trace\\n```\\n\\nAnd for the one of the most obnoxious things ever, we will use a script to download\\nthe inputs instead of \u201c_clicking, opening and copying to a file_\u201d[^1]. 
There is\\nno need to be _fancy_, so we will adjust Python script by Martin[^2].\\n\\n```py\\n#!/usr/bin/env python3\\n\\nimport datetime\\nimport yaml\\nimport requests\\nimport sys\\n\\n\\ndef load_config():\\n with open(\\"env.yaml\\", \\"r\\") as f:\\n js = yaml.load(f, Loader=yaml.Loader)\\n return js[\\"session\\"], js[\\"year\\"]\\n\\n\\ndef get_input(session, year, day):\\n return requests.get(\\n f\\"https://adventofcode.com/{year}/day/{day}/input\\",\\n cookies={\\"session\\": session},\\n headers={\\n \\"User-Agent\\": \\"{repo} by {mail}\\".format(\\n repo=\\"gitlab.com/mfocko/advent-of-code-2022\\",\\n mail=\\"me@mfocko.xyz\\",\\n )\\n },\\n ).content.decode(\\"utf-8\\")\\n\\n\\ndef main():\\n day = datetime.datetime.now().day\\n if len(sys.argv) == 2:\\n day = sys.argv[1]\\n\\n session, year = load_config()\\n problem_input = get_input(session, year, day)\\n\\n with open(f\\"./inputs/day{day:>02}.txt\\", \\"w\\") as f:\\n f.write(problem_input)\\n\\n\\nif __name__ == \\"__main__\\":\\n main()\\n```\\n\\nIf the script is called without any arguments, it will deduce the day from the\\nsystem, so we do not need to change the day every morning. It also requires a\\nconfiguration file:\\n\\n```yaml\\n# env.yaml\\nsession: \u2039your session cookie\u203a\\nyear: 2022\\n```\\n\\n### Libraries\\n\\nLooking at the list of the libraries, I have chosen \u201ca lot\u201d of them. Let\'s walk\\nthrough each of them.\\n\\n[`tracing`] and [`tracing-subscriber`] are the crates that can be used for tracing\\nand logging of your Rust programs, there are also other crates that can help you\\nwith providing backtrace to the Sentry in case you have deployed your application\\nsomewhere and you want to watch over it. In our use case we will just utilize the\\nmacros for debugging in the terminal.\\n\\n[`thiserror`], [`anyhow`] and [`color-eyre`] are used for error reporting.\\n`thiserror` is a very good choice for libraries, cause it extends the `Error`\\nfrom the `std` and allows you to create more convenient error types. Next is\\n`anyhow` which kinda builds on top of the `thiserror` and provides you with simpler\\nerror handling in binaries[^3]. And finally we have `color-eyre` which, as I found\\nout later, is a colorful (_wink wink_) extension of `eyre` which is fork of `anyhow`\\nwhile supporting customized reports.\\n\\nIn the end I have decided to remove `thiserror` and `anyhow`, since first one is\\nsuitable for libraries and the latter was basically fully replaced by `{color-,}eyre`.\\n\\n[`regex`] and [`lazy_static`] are a very good and also, I hope, self-explanatory\\ncombination. `lazy_static` allows you to have static variables that must be initialized\\nduring runtime.\\n\\n[`itertools`] provides some nice extensions to the iterators from the `std`.\\n\\n### My own \u201clibrary\u201d\\n\\nWhen creating the crate for this year\'s Advent of Code, I have chosen a library\\ntype. Even though standard library is huge, some things might not be included and\\nalso we can follow _KISS_. 
I have 2 modules that my \u201clibrary\u201d exports, one for\\nparsing and one for 2D vector (that gets used quite often during Advent of Code).\\n\\nKey part is, of course, processing the input and my library exports following\\nfunctions that get used a lot:\\n\\n```rust\\n/// Reads file to the string.\\npub fn file_to_string>(pathname: P) -> String;\\n\\n/// Reads file and returns it as a vector of characters.\\npub fn file_to_chars>(pathname: P) -> Vec;\\n\\n/// Reads file and returns a vector of parsed structures. Expects each structure\\n/// on its own line in the file. And `T` needs to implement `FromStr` trait.\\npub fn file_to_structs, T: FromStr>(pathname: P) -> Vec\\nwhere\\n ::Err: Debug;\\n\\n/// Converts iterator over strings to a vector of parsed structures. `T` needs\\n/// to implement `FromStr` trait and its error must derive `Debug` trait.\\npub fn strings_to_structs(\\n iter: impl Iterator\\n) -> Vec\\nwhere\\n ::Err: std::fmt::Debug,\\n U: Deref;\\n\\n/// Reads file and returns it as a vector of its lines.\\npub fn file_to_lines>(pathname: P) -> Vec;\\n```\\n\\nAs for the vector, I went with a rather simple implementation that allows only\\naddition of the vectors for now and accessing the elements via functions `x()`\\nand `y()`. Also the vector is generic, so we can use it with any numeric type we\\nneed.\\n\\n### Skeleton\\n\\nWe can also prepare a template to quickly bootstrap each of the days. We know\\nthat each puzzle has 2 parts, which means that we can start with 2 functions that\\nwill solve them.\\n\\n```rust\\nfn part1(input: &Input) -> Output {\\n todo!()\\n}\\n\\nfn part2(input: &Input) -> Output {\\n todo!()\\n}\\n```\\n\\nBoth functions take reference to the input and return some output (in majority\\nof puzzles, it is the same type). `todo!()` can be used as a nice placeholder,\\nit also causes a panic when reached and we could also provide some string with\\nan explanation, e.g. `todo!(\\"part 1\\")`. We have not given functions a specific\\ntype and to avoid as much copy-paste as possible, we will introduce type aliases.\\n\\n```rust\\ntype Input = String;\\ntype Output = i32;\\n```\\n\\n:::tip\\n\\nThis allows us to quickly adjust the types only in one place without the need to\\ndo _regex-replace_ or replace them manually.\\n\\n:::\\n\\nFor each day we get a personalized input that is provided as a text file. Almost\\nall the time, we would like to get some structured type out of that input, and\\ntherefore it makes sense to introduce a new function that will provide the parsing\\nof the input.\\n\\n```rust\\nfn parse_input(path: &str) -> Input {\\n todo!()\\n}\\n```\\n\\nThis \u201cparser\u201d will take a path to the file, just in case we would like to run the\\nsample instead of input.\\n\\nOK, so now we can write a `main` function that will take all of the pieces and\\nrun them.\\n\\n```rust\\nfn main() {\\n let input = parse_input(\\"inputs/dayXX.txt\\");\\n\\n println!(\\"Part 1: {}\\", part_1(&input));\\n println!(\\"Part 2: {}\\", part_2(&input));\\n}\\n```\\n\\nThis would definitely do :) But we have installed a few libraries and we want to\\nuse them. In this part we are going to utilize _[`tracing`]_ (for tracing, duh\u2026)\\nand _[`color-eyre`]_ (for better error reporting, e.g. 
from parsing).\\n\\n```rust\\nfn main() -> Result<()> {\\n tracing_subscriber::fmt()\\n .with_env_filter(EnvFilter::from_default_env())\\n .with_target(false)\\n .with_file(true)\\n .with_line_number(true)\\n .without_time()\\n .compact()\\n .init();\\n color_eyre::install()?;\\n\\n let input = parse_input(\\"inputs/dayXX.txt\\");\\n\\n info!(\\"Part 1: {}\\", part_1(&input));\\n info!(\\"Part 2: {}\\", part_2(&input));\\n\\n Ok(())\\n}\\n```\\n\\nThe first statement will set up tracing and configure it to print out the logs to\\nterminal, based on the environment variable. We also change the formatting a bit,\\nsince we do not need all the _fancy_ features of the logger. Pure initialization\\nwould get us logs like this:\\n\\n```\\n2022-12-11T19:53:19.975343Z INFO day01: Part 1: 0\\n```\\n\\nHowever after running that command, we will get the following:\\n\\n```\\n INFO src/bin/day01.rs:35: Part 1: 0\\n```\\n\\nAnd the `color_eyre::install()?` is quite straightforward. We just initialize the\\nerror reporting by _color eyre_.\\n\\n:::caution\\n\\nNotice that we had to add `Ok(())` to the end of the function and adjust the\\nreturn type of the `main` to `Result<()>`. It is caused by the _color eyre_ that\\ncan be installed only once and therefore it can fail, that is how we got the `?`\\nat the end of the `::install` which _unwraps_ the **\xbbresult\xab** of the installation.\\n\\n:::\\n\\nOverall we will get to a template like this:\\n\\n```rust\\nuse aoc_2022::*;\\n\\nuse color_eyre::eyre::Result;\\nuse tracing::info;\\nuse tracing_subscriber::EnvFilter;\\n\\ntype Input = String;\\ntype Output = i32;\\n\\nfn parse_input(path: &str) -> Input {\\n todo!()\\n}\\n\\nfn part1(input: &Input) -> Output {\\n todo!()\\n}\\n\\nfn part2(input: &Input) -> Output {\\n todo!()\\n}\\n\\nfn main() -> Result<()> {\\n tracing_subscriber::fmt()\\n .with_env_filter(EnvFilter::from_default_env())\\n .with_target(false)\\n .with_file(true)\\n .with_line_number(true)\\n .without_time()\\n .compact()\\n .init();\\n color_eyre::install()?;\\n\\n let input = parse_input(\\"inputs/dayXX.txt\\");\\n\\n info!(\\"Part 1: {}\\", part_1(&input));\\n info!(\\"Part 2: {}\\", part_2(&input));\\n\\n Ok(())\\n}\\n```\\n\\n[^1]:\\n Copy-pasting might be a relaxing thing to do, but you can also discover\\n nasty stuff about your PC. 
See [this Reddit post and the comment].\\n\\n[^2]: [GitHub profile](https://github.com/martinjonas)\\n[^3]:\\n Even though you can use it even for libraries, but handling errors from\\n libraries using `anyhow` is nasty\u2026 You will be the stinky one ;)\\n\\n[_advent of code_]: https://adventofcode.com\\n[gitlab]: https://gitlab.com/mfocko/advent-of-code-2022\\n[cargo]: https://doc.rust-lang.org/cargo/\\n[`rust-analyzer`]: https://rust-analyzer.github.io/\\n[`direnv`]: https://direnv.net/\\n[`tracing`]: https://crates.io/crates/tracing\\n[`tracing-subscriber`]: https://crates.io/crates/tracing-subscriber\\n[`thiserror`]: https://crates.io/crates/thiserror\\n[`anyhow`]: https://crates.io/crates/anyhow\\n[`color-eyre`]: https://crates.io/crates/color-eyre\\n[`regex`]: https://crates.io/crates/regex\\n[`lazy_static`]: https://crates.io/crates/lazy_static\\n[`itertools`]: https://crates.io/crates/itertools\\n[this reddit post and the comment]: https://www.reddit.com/r/adventofcode/comments/zb98pn/comment/iyq0ono"}]}')}}]); \ No newline at end of file diff --git a/assets/js/595c7293.fd69129b.js b/assets/js/595c7293.6b6976ac.js similarity index 99% rename from assets/js/595c7293.fd69129b.js rename to assets/js/595c7293.6b6976ac.js index e1fb5dd..defbf2c 100644 --- a/assets/js/595c7293.fd69129b.js +++ b/assets/js/595c7293.6b6976ac.js @@ -1 +1 @@ -"use strict";(self.webpackChunkfi=self.webpackChunkfi||[]).push([[5634],{58396:(e,n,t)=>{t.r(n),t.d(n,{assets:()=>a,contentTitle:()=>o,default:()=>h,frontMatter:()=>r,metadata:()=>c,toc:()=>l});var i=t(85893),s=t(11151);const r={id:"seminar-08",title:"8th seminar",description:"Manipulating with files only char-by-char and a magic tree.\n"},o="8th seminar bonus assignment",c={id:"bonuses/seminar-08",title:"8th seminar",description:"Manipulating with files only char-by-char and a magic tree.\n",source:"@site/c/bonuses/08.md",sourceDirName:"bonuses",slug:"/bonuses/seminar-08",permalink:"/c/bonuses/seminar-08",draft:!1,unlisted:!1,editUrl:"https://github.com/mfocko/blog/tree/main/c/bonuses/08.md",tags:[],version:"current",lastUpdatedAt:1718801795,formattedLastUpdatedAt:"Jun 19, 2024",frontMatter:{id:"seminar-08",title:"8th seminar",description:"Manipulating with files only char-by-char and a magic tree.\n"},sidebar:"autogeneratedBar",previous:{title:"5th and 6th seminar",permalink:"/c/bonuses/seminar-05-06"},next:{title:"10th seminar",permalink:"/c/bonuses/seminar-10"}},a={},l=[{value:"Introduction",id:"introduction",level:2},{value:"Warning",id:"warning",level:2},{value:"Testing",id:"testing",level:2},{value:"Task no. 1: Counting (0.75 K\u20a1)",id:"task-no-1-counting-075-k",level:2},{value:"Requirements",id:"requirements",level:3},{value:"Bonus part (0.75 K\u20a1)",id:"bonus-part-075-k",level:3},{value:"Task no. 
2: Weird trees (1 K\u20a1)",id:"task-no-2-weird-trees-1-k",level:2},{value:"Submitting",id:"submitting",level:2}];function d(e){const n={a:"a",blockquote:"blockquote",code:"code",em:"em",h1:"h1",h2:"h2",h3:"h3",hr:"hr",img:"img",li:"li",ol:"ol",p:"p",pre:"pre",strong:"strong",ul:"ul",...(0,s.a)(),...e.components};return(0,i.jsxs)(i.Fragment,{children:[(0,i.jsx)(n.h1,{id:"8th-seminar-bonus-assignment",children:"8th seminar bonus assignment"}),"\n",(0,i.jsx)(n.p,{children:(0,i.jsx)(n.a,{href:"pathname:///files/c/bonuses/08.tar.gz",children:"Source"})}),"\n",(0,i.jsx)(n.h2,{id:"introduction",children:"Introduction"}),"\n",(0,i.jsx)(n.p,{children:"In this bonus you can implement two tasks, one of them has a bonus part with generic\nsolution."}),"\n",(0,i.jsx)(n.p,{children:"One is focused on counting ananas or in case of generic version any substring in\nthe file, but with a restriction on the function you use."}),"\n",(0,i.jsx)(n.p,{children:"Other one has a more algorithmic spirit."}),"\n",(0,i.jsx)(n.p,{children:"For this bonus you can get at maximum 2.5 K\u20a1."}),"\n",(0,i.jsx)(n.h2,{id:"warning",children:"Warning"}),"\n",(0,i.jsxs)(n.p,{children:[(0,i.jsx)(n.strong,{children:"DO NOT COMMIT test data"})," to your own git repository, since the tests include\nfiles that exceed 10MB by themselves. Even if they are on separate branch, they\ntake up the space."]}),"\n",(0,i.jsx)(n.h2,{id:"testing",children:"Testing"}),"\n",(0,i.jsxs)(n.p,{children:["For testing you are provided with python script (requires ",(0,i.jsx)(n.code,{children:"click"})," to be installed:\n",(0,i.jsx)(n.code,{children:"pip3 install --user click"}),") and ",(0,i.jsx)(n.code,{children:"Makefile"})," that provides following targets:"]}),"\n",(0,i.jsxs)(n.ul,{children:["\n",(0,i.jsxs)(n.li,{children:[(0,i.jsx)(n.code,{children:"check-counting"})," - runs the ",(0,i.jsx)(n.code,{children:"counting"})," tests"]}),"\n",(0,i.jsxs)(n.li,{children:[(0,i.jsx)(n.code,{children:"check-counting-bonus"})," - runs the ",(0,i.jsx)(n.code,{children:"counting"})," tests with bonus implemented"]}),"\n",(0,i.jsxs)(n.li,{children:[(0,i.jsx)(n.code,{children:"check"})," - runs both ",(0,i.jsx)(n.code,{children:"counting"})," and ",(0,i.jsx)(n.code,{children:"counting-bonus"})," tests"]}),"\n",(0,i.jsxs)(n.li,{children:[(0,i.jsx)(n.code,{children:"clean"})," - removes output files from the test runs"]}),"\n"]}),"\n",(0,i.jsx)(n.h2,{id:"task-no-1-counting-075-k",children:"Task no. 
1: Counting (0.75 K\u20a1)"}),"\n",(0,i.jsx)(n.p,{children:"Your first task is to make smallish program that counts occurences of specific\n(or given) word from file and writes the number to other file."}),"\n",(0,i.jsx)(n.p,{children:"Usage of the program is:"}),"\n",(0,i.jsx)(n.pre,{children:(0,i.jsx)(n.code,{children:"Usage: ./counting [string-to-be-counted]\n"})}),"\n",(0,i.jsx)(n.p,{children:"Arguments that are passed to the program represent:"}),"\n",(0,i.jsxs)(n.ul,{children:["\n",(0,i.jsxs)(n.li,{children:[(0,i.jsx)(n.code,{children:""})," - path to the file where we count the words"]}),"\n",(0,i.jsxs)(n.li,{children:[(0,i.jsx)(n.code,{children:""})," - path to the file where we output the count"]}),"\n",(0,i.jsxs)(n.li,{children:["(optional argument) ",(0,i.jsx)(n.code,{children:"[string-to-be-counted]"})," - in case you implement bonus,\notherwise we default to word ",(0,i.jsx)(n.code,{children:"ananas"})," ;)"]}),"\n"]}),"\n",(0,i.jsx)(n.p,{children:"In skeleton you are given 3 empty, but documented, functions to implement."}),"\n",(0,i.jsxs)(n.ol,{children:["\n",(0,i.jsxs)(n.li,{children:[(0,i.jsx)(n.code,{children:"count_anything"})," - function accepts input file and substring to be counted in\nthe file, returns the count."]}),"\n",(0,i.jsxs)(n.li,{children:[(0,i.jsx)(n.code,{children:"count_ananas"})," - same as ",(0,i.jsx)(n.code,{children:"count_anything"}),", but specialized for ananases, the\ndefault implementation from the skeleton expects you to implement ",(0,i.jsx)(n.code,{children:"count_anything"}),"\nand therefore it just calls the other function."]}),"\n",(0,i.jsxs)(n.li,{children:[(0,i.jsx)(n.code,{children:"write_number"})," - function that writes the number to the file, why would you\nneed the function is explained later :)"]}),"\n"]}),"\n",(0,i.jsx)(n.h3,{id:"requirements",children:"Requirements"}),"\n",(0,i.jsxs)(n.p,{children:["For manipulation with the files you are only allowed to use ",(0,i.jsx)(n.code,{children:"fopen"}),", ",(0,i.jsx)(n.code,{children:"fclose"}),",\n",(0,i.jsx)(n.code,{children:"fgetc"})," and ",(0,i.jsx)(n.code,{children:"fputc"}),". Functions like ",(0,i.jsx)(n.code,{children:"fprintf"})," (except for ",(0,i.jsx)(n.code,{children:"stderr"})," or logging) and\n",(0,i.jsx)(n.code,{children:"fscanf"})," are ",(0,i.jsx)(n.strong,{children:"forbidden"}),"."]}),"\n",(0,i.jsx)(n.p,{children:"In case you struggle and want to use one of those functions, the solution will be\npenalized by 50% of points."}),"\n",(0,i.jsx)(n.h3,{id:"bonus-part-075-k",children:"Bonus part (0.75 K\u20a1)"}),"\n",(0,i.jsxs)(n.p,{children:["Bonus part of this assignment is to implement ",(0,i.jsx)(n.code,{children:"count_anything"})," rather than ",(0,i.jsx)(n.code,{children:"count_ananas"}),"."]}),"\n",(0,i.jsxs)(n.blockquote,{children:["\n",(0,i.jsx)(n.p,{children:"Smaller hint: This task does not need dynamic allocation :) You just need one\ngood helper function and the right idea ;)"}),"\n"]}),"\n",(0,i.jsx)(n.h2,{id:"task-no-2-weird-trees-1-k",children:"Task no. 
2: Weird trees (1 K\u20a1)"}),"\n",(0,i.jsxs)(n.p,{children:["In this task we are crossing our paths with ",(0,i.jsx)(n.em,{children:"algorithms and data structures"}),".\nYour task is to write a program that constructs tree from the file that is given\nas an argument and pretty-prints it."]}),"\n",(0,i.jsxs)(n.p,{children:["Input file consists of lines, that include ",(0,i.jsx)(n.code,{children:"key"})," and ",(0,i.jsx)(n.code,{children:"rank"})," in form ",(0,i.jsx)(n.code,{children:"key;rank"}),"\nor ",(0,i.jsx)(n.code,{children:"nil"}),". Why would we have ",(0,i.jsx)(n.code,{children:"nil"})," in a file? The file represents pre-order iteration\nthrough the tree. Leaves never have rank different than 0, so you can safely assume\n2 non-existing ",(0,i.jsx)(n.code,{children:"nil"}),"s in the input after you read such node ;)"]}),"\n",(0,i.jsxs)("table",{children:[(0,i.jsxs)("tr",{children:[(0,i.jsx)("th",{children:"Example input file"}),(0,i.jsx)("th",{children:"Tree it represents"})]}),(0,i.jsxs)("tr",{children:[(0,i.jsx)("td",{children:(0,i.jsx)(n.pre,{children:(0,i.jsx)(n.code,{children:"8;4\n5;3\n3;2\n2;1\n1;0\nnil\n4;0\n7;1\n6;0\nnil\n11;2\n10;1\n9;0\nnil\n12;0\n"})})}),(0,i.jsx)("td",{children:(0,i.jsx)(n.p,{children:(0,i.jsx)(n.img,{alt:"tree",src:t(30073).Z+"",width:"633",height:"684"})})})]})]}),"\n",(0,i.jsxs)(n.p,{children:["In this task you are only provided with different trees in the ",(0,i.jsx)(n.code,{children:"test-trees"})," directory.\nImplementation and format of the pretty-print is totally up to you. :)"]}),"\n",(0,i.jsx)(n.p,{children:"Example of mine for the tree above:"}),"\n",(0,i.jsx)(n.pre,{children:(0,i.jsx)(n.code,{children:"8 (rank = 4)\n+-- 5 (rank = 3)\n| +-- 3 (rank = 2)\n| | +-- 2 (rank = 1)\n| | | +-- 1 (rank = 0)\n| | +-- 4 (rank = 0)\n| +-- 7 (rank = 1)\n| +-- 6 (rank = 0)\n+-- 11 (rank = 2)\n +-- 10 (rank = 1)\n | +-- 9 (rank = 0)\n +-- 12 (rank = 0)\n"})}),"\n",(0,i.jsxs)(n.blockquote,{children:["\n",(0,i.jsx)(n.p,{children:"Can you find out what are those trees? 
:)"}),"\n"]}),"\n",(0,i.jsx)(n.h2,{id:"submitting",children:"Submitting"}),"\n",(0,i.jsx)(n.p,{children:"In case you have any questions, feel free to reach out to me."}),"\n",(0,i.jsx)(n.hr,{})]})}function h(e={}){const{wrapper:n}={...(0,s.a)(),...e.components};return n?(0,i.jsx)(n,{...e,children:(0,i.jsx)(d,{...e})}):d(e)}},30073:(e,n,t)=>{t.d(n,{Z:()=>i});const i=t.p+"assets/images/tree-c9e37f87f9095c00fad33ea034485ce6.png"},11151:(e,n,t)=>{t.d(n,{Z:()=>c,a:()=>o});var i=t(67294);const s={},r=i.createContext(s);function o(e){const n=i.useContext(r);return i.useMemo((function(){return"function"==typeof e?e(n):{...n,...e}}),[n,e])}function c(e){let n;return n=e.disableParentContext?"function"==typeof e.components?e.components(s):e.components||s:o(e.components),i.createElement(r.Provider,{value:n},e.children)}}}]); \ No newline at end of file +"use strict";(self.webpackChunkfi=self.webpackChunkfi||[]).push([[5634],{58396:(e,n,t)=>{t.r(n),t.d(n,{assets:()=>a,contentTitle:()=>o,default:()=>h,frontMatter:()=>r,metadata:()=>c,toc:()=>l});var i=t(85893),s=t(11151);const r={id:"seminar-08",title:"8th seminar",description:"Manipulating with files only char-by-char and a magic tree.\n"},o="8th seminar bonus assignment",c={id:"bonuses/seminar-08",title:"8th seminar",description:"Manipulating with files only char-by-char and a magic tree.\n",source:"@site/c/bonuses/08.md",sourceDirName:"bonuses",slug:"/bonuses/seminar-08",permalink:"/c/bonuses/seminar-08",draft:!1,unlisted:!1,editUrl:"https://github.com/mfocko/blog/tree/main/c/bonuses/08.md",tags:[],version:"current",lastUpdatedAt:1718813674,formattedLastUpdatedAt:"Jun 19, 2024",frontMatter:{id:"seminar-08",title:"8th seminar",description:"Manipulating with files only char-by-char and a magic tree.\n"},sidebar:"autogeneratedBar",previous:{title:"5th and 6th seminar",permalink:"/c/bonuses/seminar-05-06"},next:{title:"10th seminar",permalink:"/c/bonuses/seminar-10"}},a={},l=[{value:"Introduction",id:"introduction",level:2},{value:"Warning",id:"warning",level:2},{value:"Testing",id:"testing",level:2},{value:"Task no. 1: Counting (0.75 K\u20a1)",id:"task-no-1-counting-075-k",level:2},{value:"Requirements",id:"requirements",level:3},{value:"Bonus part (0.75 K\u20a1)",id:"bonus-part-075-k",level:3},{value:"Task no. 
2: Weird trees (1 K\u20a1)",id:"task-no-2-weird-trees-1-k",level:2},{value:"Submitting",id:"submitting",level:2}];function d(e){const n={a:"a",blockquote:"blockquote",code:"code",em:"em",h1:"h1",h2:"h2",h3:"h3",hr:"hr",img:"img",li:"li",ol:"ol",p:"p",pre:"pre",strong:"strong",ul:"ul",...(0,s.a)(),...e.components};return(0,i.jsxs)(i.Fragment,{children:[(0,i.jsx)(n.h1,{id:"8th-seminar-bonus-assignment",children:"8th seminar bonus assignment"}),"\n",(0,i.jsx)(n.p,{children:(0,i.jsx)(n.a,{href:"pathname:///files/c/bonuses/08.tar.gz",children:"Source"})}),"\n",(0,i.jsx)(n.h2,{id:"introduction",children:"Introduction"}),"\n",(0,i.jsx)(n.p,{children:"In this bonus you can implement two tasks, one of them has a bonus part with generic\nsolution."}),"\n",(0,i.jsx)(n.p,{children:"One is focused on counting ananas or in case of generic version any substring in\nthe file, but with a restriction on the function you use."}),"\n",(0,i.jsx)(n.p,{children:"Other one has a more algorithmic spirit."}),"\n",(0,i.jsx)(n.p,{children:"For this bonus you can get at maximum 2.5 K\u20a1."}),"\n",(0,i.jsx)(n.h2,{id:"warning",children:"Warning"}),"\n",(0,i.jsxs)(n.p,{children:[(0,i.jsx)(n.strong,{children:"DO NOT COMMIT test data"})," to your own git repository, since the tests include\nfiles that exceed 10MB by themselves. Even if they are on separate branch, they\ntake up the space."]}),"\n",(0,i.jsx)(n.h2,{id:"testing",children:"Testing"}),"\n",(0,i.jsxs)(n.p,{children:["For testing you are provided with python script (requires ",(0,i.jsx)(n.code,{children:"click"})," to be installed:\n",(0,i.jsx)(n.code,{children:"pip3 install --user click"}),") and ",(0,i.jsx)(n.code,{children:"Makefile"})," that provides following targets:"]}),"\n",(0,i.jsxs)(n.ul,{children:["\n",(0,i.jsxs)(n.li,{children:[(0,i.jsx)(n.code,{children:"check-counting"})," - runs the ",(0,i.jsx)(n.code,{children:"counting"})," tests"]}),"\n",(0,i.jsxs)(n.li,{children:[(0,i.jsx)(n.code,{children:"check-counting-bonus"})," - runs the ",(0,i.jsx)(n.code,{children:"counting"})," tests with bonus implemented"]}),"\n",(0,i.jsxs)(n.li,{children:[(0,i.jsx)(n.code,{children:"check"})," - runs both ",(0,i.jsx)(n.code,{children:"counting"})," and ",(0,i.jsx)(n.code,{children:"counting-bonus"})," tests"]}),"\n",(0,i.jsxs)(n.li,{children:[(0,i.jsx)(n.code,{children:"clean"})," - removes output files from the test runs"]}),"\n"]}),"\n",(0,i.jsx)(n.h2,{id:"task-no-1-counting-075-k",children:"Task no. 
1: Counting (0.75 K\u20a1)"}),"\n",(0,i.jsx)(n.p,{children:"Your first task is to make smallish program that counts occurences of specific\n(or given) word from file and writes the number to other file."}),"\n",(0,i.jsx)(n.p,{children:"Usage of the program is:"}),"\n",(0,i.jsx)(n.pre,{children:(0,i.jsx)(n.code,{children:"Usage: ./counting [string-to-be-counted]\n"})}),"\n",(0,i.jsx)(n.p,{children:"Arguments that are passed to the program represent:"}),"\n",(0,i.jsxs)(n.ul,{children:["\n",(0,i.jsxs)(n.li,{children:[(0,i.jsx)(n.code,{children:""})," - path to the file where we count the words"]}),"\n",(0,i.jsxs)(n.li,{children:[(0,i.jsx)(n.code,{children:""})," - path to the file where we output the count"]}),"\n",(0,i.jsxs)(n.li,{children:["(optional argument) ",(0,i.jsx)(n.code,{children:"[string-to-be-counted]"})," - in case you implement bonus,\notherwise we default to word ",(0,i.jsx)(n.code,{children:"ananas"})," ;)"]}),"\n"]}),"\n",(0,i.jsx)(n.p,{children:"In skeleton you are given 3 empty, but documented, functions to implement."}),"\n",(0,i.jsxs)(n.ol,{children:["\n",(0,i.jsxs)(n.li,{children:[(0,i.jsx)(n.code,{children:"count_anything"})," - function accepts input file and substring to be counted in\nthe file, returns the count."]}),"\n",(0,i.jsxs)(n.li,{children:[(0,i.jsx)(n.code,{children:"count_ananas"})," - same as ",(0,i.jsx)(n.code,{children:"count_anything"}),", but specialized for ananases, the\ndefault implementation from the skeleton expects you to implement ",(0,i.jsx)(n.code,{children:"count_anything"}),"\nand therefore it just calls the other function."]}),"\n",(0,i.jsxs)(n.li,{children:[(0,i.jsx)(n.code,{children:"write_number"})," - function that writes the number to the file, why would you\nneed the function is explained later :)"]}),"\n"]}),"\n",(0,i.jsx)(n.h3,{id:"requirements",children:"Requirements"}),"\n",(0,i.jsxs)(n.p,{children:["For manipulation with the files you are only allowed to use ",(0,i.jsx)(n.code,{children:"fopen"}),", ",(0,i.jsx)(n.code,{children:"fclose"}),",\n",(0,i.jsx)(n.code,{children:"fgetc"})," and ",(0,i.jsx)(n.code,{children:"fputc"}),". Functions like ",(0,i.jsx)(n.code,{children:"fprintf"})," (except for ",(0,i.jsx)(n.code,{children:"stderr"})," or logging) and\n",(0,i.jsx)(n.code,{children:"fscanf"})," are ",(0,i.jsx)(n.strong,{children:"forbidden"}),"."]}),"\n",(0,i.jsx)(n.p,{children:"In case you struggle and want to use one of those functions, the solution will be\npenalized by 50% of points."}),"\n",(0,i.jsx)(n.h3,{id:"bonus-part-075-k",children:"Bonus part (0.75 K\u20a1)"}),"\n",(0,i.jsxs)(n.p,{children:["Bonus part of this assignment is to implement ",(0,i.jsx)(n.code,{children:"count_anything"})," rather than ",(0,i.jsx)(n.code,{children:"count_ananas"}),"."]}),"\n",(0,i.jsxs)(n.blockquote,{children:["\n",(0,i.jsx)(n.p,{children:"Smaller hint: This task does not need dynamic allocation :) You just need one\ngood helper function and the right idea ;)"}),"\n"]}),"\n",(0,i.jsx)(n.h2,{id:"task-no-2-weird-trees-1-k",children:"Task no. 
2: Weird trees (1 K\u20a1)"}),"\n",(0,i.jsxs)(n.p,{children:["In this task we are crossing our paths with ",(0,i.jsx)(n.em,{children:"algorithms and data structures"}),".\nYour task is to write a program that constructs tree from the file that is given\nas an argument and pretty-prints it."]}),"\n",(0,i.jsxs)(n.p,{children:["Input file consists of lines, that include ",(0,i.jsx)(n.code,{children:"key"})," and ",(0,i.jsx)(n.code,{children:"rank"})," in form ",(0,i.jsx)(n.code,{children:"key;rank"}),"\nor ",(0,i.jsx)(n.code,{children:"nil"}),". Why would we have ",(0,i.jsx)(n.code,{children:"nil"})," in a file? The file represents pre-order iteration\nthrough the tree. Leaves never have rank different than 0, so you can safely assume\n2 non-existing ",(0,i.jsx)(n.code,{children:"nil"}),"s in the input after you read such node ;)"]}),"\n",(0,i.jsxs)("table",{children:[(0,i.jsxs)("tr",{children:[(0,i.jsx)("th",{children:"Example input file"}),(0,i.jsx)("th",{children:"Tree it represents"})]}),(0,i.jsxs)("tr",{children:[(0,i.jsx)("td",{children:(0,i.jsx)(n.pre,{children:(0,i.jsx)(n.code,{children:"8;4\n5;3\n3;2\n2;1\n1;0\nnil\n4;0\n7;1\n6;0\nnil\n11;2\n10;1\n9;0\nnil\n12;0\n"})})}),(0,i.jsx)("td",{children:(0,i.jsx)(n.p,{children:(0,i.jsx)(n.img,{alt:"tree",src:t(30073).Z+"",width:"633",height:"684"})})})]})]}),"\n",(0,i.jsxs)(n.p,{children:["In this task you are only provided with different trees in the ",(0,i.jsx)(n.code,{children:"test-trees"})," directory.\nImplementation and format of the pretty-print is totally up to you. :)"]}),"\n",(0,i.jsx)(n.p,{children:"Example of mine for the tree above:"}),"\n",(0,i.jsx)(n.pre,{children:(0,i.jsx)(n.code,{children:"8 (rank = 4)\n+-- 5 (rank = 3)\n| +-- 3 (rank = 2)\n| | +-- 2 (rank = 1)\n| | | +-- 1 (rank = 0)\n| | +-- 4 (rank = 0)\n| +-- 7 (rank = 1)\n| +-- 6 (rank = 0)\n+-- 11 (rank = 2)\n +-- 10 (rank = 1)\n | +-- 9 (rank = 0)\n +-- 12 (rank = 0)\n"})}),"\n",(0,i.jsxs)(n.blockquote,{children:["\n",(0,i.jsx)(n.p,{children:"Can you find out what are those trees? 
:)"}),"\n"]}),"\n",(0,i.jsx)(n.h2,{id:"submitting",children:"Submitting"}),"\n",(0,i.jsx)(n.p,{children:"In case you have any questions, feel free to reach out to me."}),"\n",(0,i.jsx)(n.hr,{})]})}function h(e={}){const{wrapper:n}={...(0,s.a)(),...e.components};return n?(0,i.jsx)(n,{...e,children:(0,i.jsx)(d,{...e})}):d(e)}},30073:(e,n,t)=>{t.d(n,{Z:()=>i});const i=t.p+"assets/images/tree-c9e37f87f9095c00fad33ea034485ce6.png"},11151:(e,n,t)=>{t.d(n,{Z:()=>c,a:()=>o});var i=t(67294);const s={},r=i.createContext(s);function o(e){const n=i.useContext(r);return i.useMemo((function(){return"function"==typeof e?e(n):{...n,...e}}),[n,e])}function c(e){let n;return n=e.disableParentContext?"function"==typeof e.components?e.components(s):e.components||s:o(e.components),i.createElement(r.Provider,{value:n},e.children)}}}]); \ No newline at end of file diff --git a/assets/js/7052c0bc.81ee0b8a.js b/assets/js/7052c0bc.c6270640.js similarity index 95% rename from assets/js/7052c0bc.81ee0b8a.js rename to assets/js/7052c0bc.c6270640.js index b52fe4d..eaaff1c 100644 --- a/assets/js/7052c0bc.81ee0b8a.js +++ b/assets/js/7052c0bc.c6270640.js @@ -1 +1 @@ -"use strict";(self.webpackChunkfi=self.webpackChunkfi||[]).push([[9731],{42286:(t,e,n)=>{n.r(e),n.d(e,{assets:()=>a,contentTitle:()=>c,default:()=>d,frontMatter:()=>i,metadata:()=>s,toc:()=>p});var o=n(85893),r=n(11151);const i={id:"cpp-intro",title:"Introduction",slug:"/"},c=void 0,s={id:"cpp-intro",title:"Introduction",description:"",source:"@site/cpp/00-intro.md",sourceDirName:".",slug:"/",permalink:"/cpp/",draft:!1,unlisted:!1,editUrl:"https://github.com/mfocko/blog/tree/main/cpp/00-intro.md",tags:[],version:"current",lastUpdatedAt:1718801795,formattedLastUpdatedAt:"Jun 19, 2024",sidebarPosition:0,frontMatter:{id:"cpp-intro",title:"Introduction",slug:"/"},sidebar:"autogeneratedBar",next:{title:"Exceptions and RAII",permalink:"/cpp/category/exceptions-and-raii"}},a={},p=[];function u(t){return(0,o.jsx)(o.Fragment,{})}function d(t={}){const{wrapper:e}={...(0,r.a)(),...t.components};return e?(0,o.jsx)(e,{...t,children:(0,o.jsx)(u,{...t})}):u()}},11151:(t,e,n)=>{n.d(e,{Z:()=>s,a:()=>c});var o=n(67294);const r={},i=o.createContext(r);function c(t){const e=o.useContext(i);return o.useMemo((function(){return"function"==typeof t?t(e):{...e,...t}}),[e,t])}function s(t){let e;return e=t.disableParentContext?"function"==typeof t.components?t.components(r):t.components||r:c(t.components),o.createElement(i.Provider,{value:e},t.children)}}}]); \ No newline at end of file +"use strict";(self.webpackChunkfi=self.webpackChunkfi||[]).push([[9731],{42286:(t,e,n)=>{n.r(e),n.d(e,{assets:()=>a,contentTitle:()=>c,default:()=>d,frontMatter:()=>i,metadata:()=>s,toc:()=>p});var o=n(85893),r=n(11151);const i={id:"cpp-intro",title:"Introduction",slug:"/"},c=void 0,s={id:"cpp-intro",title:"Introduction",description:"",source:"@site/cpp/00-intro.md",sourceDirName:".",slug:"/",permalink:"/cpp/",draft:!1,unlisted:!1,editUrl:"https://github.com/mfocko/blog/tree/main/cpp/00-intro.md",tags:[],version:"current",lastUpdatedAt:1718813674,formattedLastUpdatedAt:"Jun 19, 2024",sidebarPosition:0,frontMatter:{id:"cpp-intro",title:"Introduction",slug:"/"},sidebar:"autogeneratedBar",next:{title:"Exceptions and RAII",permalink:"/cpp/category/exceptions-and-raii"}},a={},p=[];function u(t){return(0,o.jsx)(o.Fragment,{})}function d(t={}){const{wrapper:e}={...(0,r.a)(),...t.components};return e?(0,o.jsx)(e,{...t,children:(0,o.jsx)(u,{...t})}):u()}},11151:(t,e,n)=>{n.d(e,{Z:()=>s,a:()=>c});var 
o=n(67294);const r={},i=o.createContext(r);function c(t){const e=o.useContext(i);return o.useMemo((function(){return"function"==typeof t?t(e):{...e,...t}}),[e,t])}function s(t){let e;return e=t.disableParentContext?"function"==typeof t.components?t.components(r):t.components||r:c(t.components),o.createElement(i.Provider,{value:e},t.children)}}}]); \ No newline at end of file diff --git a/assets/js/794ef108.4d1fd72a.js b/assets/js/794ef108.7d5bcd90.js similarity index 95% rename from assets/js/794ef108.4d1fd72a.js rename to assets/js/794ef108.7d5bcd90.js index e74bc92..0f93a83 100644 --- a/assets/js/794ef108.4d1fd72a.js +++ b/assets/js/794ef108.7d5bcd90.js @@ -1 +1 @@ -"use strict";(self.webpackChunkfi=self.webpackChunkfi||[]).push([[3803],{86427:(t,e,n)=>{n.r(e),n.d(e,{assets:()=>a,contentTitle:()=>s,default:()=>l,frontMatter:()=>i,metadata:()=>c,toc:()=>u});var o=n(85893),r=n(11151);const i={id:"c-intro",title:"Introduction",slug:"/"},s=void 0,c={id:"c-intro",title:"Introduction",description:"",source:"@site/c/00-intro.md",sourceDirName:".",slug:"/",permalink:"/c/",draft:!1,unlisted:!1,editUrl:"https://github.com/mfocko/blog/tree/main/c/00-intro.md",tags:[],version:"current",lastUpdatedAt:1718801795,formattedLastUpdatedAt:"Jun 19, 2024",sidebarPosition:0,frontMatter:{id:"c-intro",title:"Introduction",slug:"/"},sidebar:"autogeneratedBar",next:{title:"Bonuses",permalink:"/c/category/bonuses"}},a={},u=[];function d(t){return(0,o.jsx)(o.Fragment,{})}function l(t={}){const{wrapper:e}={...(0,r.a)(),...t.components};return e?(0,o.jsx)(e,{...t,children:(0,o.jsx)(d,{...t})}):d()}},11151:(t,e,n)=>{n.d(e,{Z:()=>c,a:()=>s});var o=n(67294);const r={},i=o.createContext(r);function s(t){const e=o.useContext(i);return o.useMemo((function(){return"function"==typeof t?t(e):{...e,...t}}),[e,t])}function c(t){let e;return e=t.disableParentContext?"function"==typeof t.components?t.components(r):t.components||r:s(t.components),o.createElement(i.Provider,{value:e},t.children)}}}]); \ No newline at end of file +"use strict";(self.webpackChunkfi=self.webpackChunkfi||[]).push([[3803],{86427:(t,e,n)=>{n.r(e),n.d(e,{assets:()=>a,contentTitle:()=>s,default:()=>l,frontMatter:()=>i,metadata:()=>c,toc:()=>u});var o=n(85893),r=n(11151);const i={id:"c-intro",title:"Introduction",slug:"/"},s=void 0,c={id:"c-intro",title:"Introduction",description:"",source:"@site/c/00-intro.md",sourceDirName:".",slug:"/",permalink:"/c/",draft:!1,unlisted:!1,editUrl:"https://github.com/mfocko/blog/tree/main/c/00-intro.md",tags:[],version:"current",lastUpdatedAt:1718813674,formattedLastUpdatedAt:"Jun 19, 2024",sidebarPosition:0,frontMatter:{id:"c-intro",title:"Introduction",slug:"/"},sidebar:"autogeneratedBar",next:{title:"Bonuses",permalink:"/c/category/bonuses"}},a={},u=[];function d(t){return(0,o.jsx)(o.Fragment,{})}function l(t={}){const{wrapper:e}={...(0,r.a)(),...t.components};return e?(0,o.jsx)(e,{...t,children:(0,o.jsx)(d,{...t})}):d()}},11151:(t,e,n)=>{n.d(e,{Z:()=>c,a:()=>s});var o=n(67294);const r={},i=o.createContext(r);function s(t){const e=o.useContext(i);return o.useMemo((function(){return"function"==typeof t?t(e):{...e,...t}}),[e,t])}function c(t){let e;return e=t.disableParentContext?"function"==typeof t.components?t.components(r):t.components||r:s(t.components),o.createElement(i.Provider,{value:e},t.children)}}}]); \ No newline at end of file diff --git a/assets/js/84d1e0d8.261de0bf.js b/assets/js/84d1e0d8.8c95a966.js similarity index 97% rename from assets/js/84d1e0d8.261de0bf.js rename to assets/js/84d1e0d8.8c95a966.js 
index d56732a..a7bfbcc 100644 --- a/assets/js/84d1e0d8.261de0bf.js +++ b/assets/js/84d1e0d8.8c95a966.js @@ -1 +1 @@ -"use strict";(self.webpackChunkfi=self.webpackChunkfi||[]).push([[1885],{49713:(t,e,n)=>{n.r(e),n.d(e,{assets:()=>c,contentTitle:()=>i,default:()=>u,frontMatter:()=>r,metadata:()=>a,toc:()=>d});var o=n(85893),s=n(11151);const r={id:"algorithms-intro",title:"Introduction",slug:"/"},i=void 0,a={id:"algorithms-intro",title:"Introduction",description:"In this part you can find \u201crandom\u201d additional materials I have written over the",source:"@site/algorithms/00-intro.md",sourceDirName:".",slug:"/",permalink:"/algorithms/",draft:!1,unlisted:!1,editUrl:"https://github.com/mfocko/blog/tree/main/algorithms/00-intro.md",tags:[],version:"current",lastUpdatedAt:1718801795,formattedLastUpdatedAt:"Jun 19, 2024",sidebarPosition:0,frontMatter:{id:"algorithms-intro",title:"Introduction",slug:"/"},sidebar:"autogeneratedBar",next:{title:"Algorithms and Correctness",permalink:"/algorithms/category/algorithms-and-correctness"}},c={},d=[];function l(t){const e={a:"a",em:"em",p:"p",...(0,s.a)(),...t.components};return(0,o.jsxs)(o.Fragment,{children:[(0,o.jsxs)(e.p,{children:["In this part you can find \u201crandom\u201d additional materials I have written over the\ncourse of teaching ",(0,o.jsx)(e.em,{children:"Algorithms and data structures I"}),"."]}),"\n",(0,o.jsx)(e.p,{children:"It is a various mix of stuff that may have been produced as a follow-up on some\nquestion asked at the seminar or spontanously."}),"\n",(0,o.jsxs)(e.p,{children:["If you have some ideas for posts, please do not hesitate to submit them as issues\nin the linked ",(0,o.jsx)(e.a,{href:"https://gitlab.fi.muni.cz/xfocko/kb/issues",children:"GitLab"}),"."]})]})}function u(t={}){const{wrapper:e}={...(0,s.a)(),...t.components};return e?(0,o.jsx)(e,{...t,children:(0,o.jsx)(l,{...t})}):l(t)}},11151:(t,e,n)=>{n.d(e,{Z:()=>a,a:()=>i});var o=n(67294);const s={},r=o.createContext(s);function i(t){const e=o.useContext(r);return o.useMemo((function(){return"function"==typeof t?t(e):{...e,...t}}),[e,t])}function a(t){let e;return e=t.disableParentContext?"function"==typeof t.components?t.components(s):t.components||s:i(t.components),o.createElement(r.Provider,{value:e},t.children)}}}]); \ No newline at end of file +"use strict";(self.webpackChunkfi=self.webpackChunkfi||[]).push([[1885],{49713:(t,e,n)=>{n.r(e),n.d(e,{assets:()=>c,contentTitle:()=>i,default:()=>u,frontMatter:()=>r,metadata:()=>a,toc:()=>d});var o=n(85893),s=n(11151);const r={id:"algorithms-intro",title:"Introduction",slug:"/"},i=void 0,a={id:"algorithms-intro",title:"Introduction",description:"In this part you can find \u201crandom\u201d additional materials I have written over the",source:"@site/algorithms/00-intro.md",sourceDirName:".",slug:"/",permalink:"/algorithms/",draft:!1,unlisted:!1,editUrl:"https://github.com/mfocko/blog/tree/main/algorithms/00-intro.md",tags:[],version:"current",lastUpdatedAt:1718813674,formattedLastUpdatedAt:"Jun 19, 2024",sidebarPosition:0,frontMatter:{id:"algorithms-intro",title:"Introduction",slug:"/"},sidebar:"autogeneratedBar",next:{title:"Algorithms and Correctness",permalink:"/algorithms/category/algorithms-and-correctness"}},c={},d=[];function l(t){const e={a:"a",em:"em",p:"p",...(0,s.a)(),...t.components};return(0,o.jsxs)(o.Fragment,{children:[(0,o.jsxs)(e.p,{children:["In this part you can find \u201crandom\u201d additional materials I have written over the\ncourse of teaching ",(0,o.jsx)(e.em,{children:"Algorithms and data 
structures I"}),"."]}),"\n",(0,o.jsx)(e.p,{children:"It is a various mix of stuff that may have been produced as a follow-up on some\nquestion asked at the seminar or spontanously."}),"\n",(0,o.jsxs)(e.p,{children:["If you have some ideas for posts, please do not hesitate to submit them as issues\nin the linked ",(0,o.jsx)(e.a,{href:"https://gitlab.fi.muni.cz/xfocko/kb/issues",children:"GitLab"}),"."]})]})}function u(t={}){const{wrapper:e}={...(0,s.a)(),...t.components};return e?(0,o.jsx)(e,{...t,children:(0,o.jsx)(l,{...t})}):l(t)}},11151:(t,e,n)=>{n.d(e,{Z:()=>a,a:()=>i});var o=n(67294);const s={},r=o.createContext(s);function i(t){const e=o.useContext(r);return o.useMemo((function(){return"function"==typeof t?t(e):{...e,...t}}),[e,t])}function a(t){let e;return e=t.disableParentContext?"function"==typeof t.components?t.components(s):t.components||s:i(t.components),o.createElement(r.Provider,{value:e},t.children)}}}]); \ No newline at end of file diff --git a/assets/js/b1288602.dbc27f32.js b/assets/js/b1288602.1ec913ee.js similarity index 99% rename from assets/js/b1288602.dbc27f32.js rename to assets/js/b1288602.1ec913ee.js index 8085183..33a7394 100644 --- a/assets/js/b1288602.dbc27f32.js +++ b/assets/js/b1288602.1ec913ee.js @@ -1 +1 @@ -"use strict";(self.webpackChunkfi=self.webpackChunkfi||[]).push([[59],{51456:(e,n,t)=>{t.r(n),t.d(n,{assets:()=>c,contentTitle:()=>i,default:()=>d,frontMatter:()=>o,metadata:()=>a,toc:()=>h});var r=t(85893),s=t(11151);const o={title:"Submitting merge requests"},i="Submitting merge requests for review",a={id:"mr",title:"Submitting merge requests",description:"This tutorial aims to show you how to follow basic git workflow and submit changes",source:"@site/c/mr.md",sourceDirName:".",slug:"/mr",permalink:"/c/mr",draft:!1,unlisted:!1,editUrl:"https://github.com/mfocko/blog/tree/main/c/mr.md",tags:[],version:"current",lastUpdatedAt:1718801795,formattedLastUpdatedAt:"Jun 19, 2024",frontMatter:{title:"Submitting merge requests"},sidebar:"autogeneratedBar",previous:{title:"Practice exam C",permalink:"/c/pexam/cams"}},c={},h=[{value:"Tutorial",id:"tutorial",level:2},{value:"Step #1 - Starting from the clean repository",id:"step-1---starting-from-the-clean-repository",level:3},{value:"Step #2 - Create new branch",id:"step-2---create-new-branch",level:3},{value:"Step #3 - Do the assignment",id:"step-3---do-the-assignment",level:3},{value:"Step #4 - Commit and upload the changes to GitLab",id:"step-4---commit-and-upload-the-changes-to-gitlab",level:3},{value:"Step #5 - Creating a merge request manually",id:"step-5---creating-a-merge-request-manually",level:3},{value:"Step #6 - Set assignees",id:"step-6---set-assignees",level:3},{value:"Step #7 - Return to default branch",id:"step-7---return-to-default-branch",level:3}];function l(e){const n={a:"a",blockquote:"blockquote",code:"code",em:"em",h1:"h1",h2:"h2",h3:"h3",hr:"hr",li:"li",ol:"ol",p:"p",pre:"pre",strong:"strong",...(0,s.a)(),...e.components};return(0,r.jsxs)(r.Fragment,{children:[(0,r.jsx)(n.h1,{id:"submitting-merge-requests-for-review",children:"Submitting merge requests for review"}),"\n",(0,r.jsxs)(n.p,{children:["This tutorial aims to show you how to follow basic git workflow and submit changes\nthrough ",(0,r.jsx)(n.em,{children:"Merge Requests"})," for review."]}),"\n",(0,r.jsxs)(n.p,{children:["The rudimentary idea behind aims for changes to be present on a separate branch\nthat is supposedly ",(0,r.jsx)(n.em,{children:"merged"})," into the default branch. 
Till then changes can be reviewed\non ",(0,r.jsx)(n.em,{children:"Merge Request"})," and additional changes may be made based on the reviews. Once\nthe changes satisfy requirements, the merge request is merged."]}),"\n",(0,r.jsx)(n.h2,{id:"tutorial",children:"Tutorial"}),"\n",(0,r.jsxs)(n.blockquote,{children:["\n",(0,r.jsxs)(n.p,{children:["Use this tutorial only for bonus assignments ",(0,r.jsx)(n.strong,{children:"made by your tutors"})," or in case\nyou need to make up for the absence."]}),"\n"]}),"\n",(0,r.jsx)(n.h3,{id:"step-1---starting-from-the-clean-repository",children:"Step #1 - Starting from the clean repository"}),"\n",(0,r.jsxs)(n.p,{children:["In your repository (either locally or on aisa) type ",(0,r.jsx)(n.code,{children:"git status"})," and check if your\nrepository is clean and you are present on the main branch (",(0,r.jsx)(n.code,{children:"master"}),", ",(0,r.jsx)(n.code,{children:"main"})," or\n",(0,r.jsx)(n.code,{children:"trunk"}),"). If you do not know what your default branch is, it is probably ",(0,r.jsx)(n.code,{children:"master"}),"\nand you should not be on any other branch."]}),"\n",(0,r.jsx)(n.p,{children:"Output of the command should look like this:"}),"\n",(0,r.jsx)(n.pre,{children:(0,r.jsx)(n.code,{children:"aisa$ git status\nOn branch master # Or main or trunk.\nYour branch is up to date with 'origin/master'.\n\nnothing to commit, working tree clean\n"})}),"\n",(0,r.jsxs)(n.blockquote,{children:["\n",(0,r.jsxs)(n.p,{children:["In case you are on different branch or there are uncommitted changes,\n",(0,r.jsx)(n.strong,{children:"do not continue!!!"})," Clean your repository (commit the changes or discard\nthem), before you continue."]}),"\n"]}),"\n",(0,r.jsx)(n.h3,{id:"step-2---create-new-branch",children:"Step #2 - Create new branch"}),"\n",(0,r.jsx)(n.p,{children:"In your repository write command:"}),"\n",(0,r.jsx)(n.pre,{children:(0,r.jsx)(n.code,{children:"aisa$ git checkout -b BRANCH\nSwitched to a new branch 'BRANCH'\n"})}),"\n",(0,r.jsxs)(n.p,{children:["Instead of ",(0,r.jsx)(n.code,{children:"BRANCH"})," use some reasonable name for the branch. For example if you\nare working on the seminar from 3rd week, name the branch ",(0,r.jsx)(n.code,{children:"seminar-03"}),"."]}),"\n",(0,r.jsx)(n.h3,{id:"step-3---do-the-assignment",children:"Step #3 - Do the assignment"}),"\n",(0,r.jsx)(n.p,{children:"Download the skeleton for the seminar assignment, extract and program. For example\nif you are working on 3rd seminar, you can do so by:"}),"\n",(0,r.jsx)(n.pre,{children:(0,r.jsx)(n.code,{children:"aisa$ wget https://www.fi.muni.cz/pb071/seminars/seminar-03/pb071-seminar-03.zip\naisa$ unzip pb071-seminar-03.zip\n# Now you should have directory 'seminar-03'.\naisa$ rm pb071-seminar-03.zip\naisa$ cd seminar-03\n# You can work on the assignment.\n"})}),"\n",(0,r.jsx)(n.h3,{id:"step-4---commit-and-upload-the-changes-to-gitlab",children:"Step #4 - Commit and upload the changes to GitLab"}),"\n",(0,r.jsxs)(n.p,{children:["The same way you ",(0,r.jsx)(n.em,{children:"add"})," and ",(0,r.jsx)(n.em,{children:"commit"})," files for the homework assignments, you do for\nthe seminar."]}),"\n",(0,r.jsxs)(n.p,{children:["Now you can upload the changes to GitLab. ",(0,r.jsx)(n.code,{children:"git push"})," is not enough, since repository\non GitLab does not know your new branch. 
You can solve this by adding arguments:"]}),"\n",(0,r.jsx)(n.pre,{children:(0,r.jsx)(n.code,{children:"aisa$ git push origin BRANCH\n...\nremote: To create a merge request for BRANCH, visit:\nremote: https://gitlab.fi.muni.cz/login/pb071/merge_requests/new?merge_request%5Bsource_branch%5D=BRANCH\n...\n"})}),"\n",(0,r.jsx)(n.p,{children:"In the output you should have a link for creating a merge request. If you see this\nlink, open it and skip next step."}),"\n",(0,r.jsx)(n.h3,{id:"step-5---creating-a-merge-request-manually",children:"Step #5 - Creating a merge request manually"}),"\n",(0,r.jsxs)(n.ol,{children:["\n",(0,r.jsx)(n.li,{children:"Open your repository on GitLab."}),"\n",(0,r.jsxs)(n.li,{children:["On the left panel click on ",(0,r.jsx)(n.em,{children:"Merge Requests"}),"."]}),"\n",(0,r.jsxs)(n.li,{children:["Click on ",(0,r.jsx)(n.em,{children:"New Merge Request"}),"."]}),"\n",(0,r.jsxs)(n.li,{children:["In ",(0,r.jsx)(n.em,{children:"Source branch"})," select ",(0,r.jsx)(n.code,{children:"login/pb071"})," and ",(0,r.jsx)(n.code,{children:"BRANCH"}),", which you created."]}),"\n",(0,r.jsxs)(n.li,{children:["In ",(0,r.jsx)(n.em,{children:"Target branch"})," select ",(0,r.jsx)(n.code,{children:"login/pb071"})," and your default branch you have seen\nin the output of the first command. (most likely ",(0,r.jsx)(n.code,{children:"master"}),")"]}),"\n",(0,r.jsxs)(n.li,{children:["Click on ",(0,r.jsx)(n.em,{children:"Compare branches and continue"}),"."]}),"\n"]}),"\n",(0,r.jsx)(n.h3,{id:"step-6---set-assignees",children:"Step #6 - Set assignees"}),"\n",(0,r.jsxs)(n.p,{children:["On the page that is opened, please check at the top that you are creating merge\nrequest ",(0,r.jsx)(n.strong,{children:"from"})," your new branch ",(0,r.jsx)(n.strong,{children:"to"})," your default branch (one of ",(0,r.jsx)(n.code,{children:"master"}),", ",(0,r.jsx)(n.code,{children:"main"}),"\nor ",(0,r.jsx)(n.code,{children:"trunk"}),")."]}),"\n",(0,r.jsxs)(n.p,{children:["Then in the field ",(0,r.jsx)(n.em,{children:"Assignees"})," set your tutors based on the seminar group. You can\nuse login for a quick look up."]}),"\n",(0,r.jsxs)(n.p,{children:["In the end click on ",(0,r.jsx)(n.em,{children:"Submit merge request"}),"."]}),"\n",(0,r.jsx)(n.h3,{id:"step-7---return-to-default-branch",children:"Step #7 - Return to default branch"}),"\n",(0,r.jsx)(n.p,{children:"Homework assignments can be submitted only from branches specified in the rules\nfor the course. Because of that, before you do anything else, you should switch\nback to your default branch."}),"\n",(0,r.jsxs)(n.p,{children:["First of all, same as in step #1, check that your repository is clean with ",(0,r.jsx)(n.code,{children:"git status"}),".\nFor the sake of safety, do not continue without clean repository. Then with command\n",(0,r.jsx)(n.code,{children:"git checkout BRANCH"})," switch to your default branch ",(0,r.jsx)(n.code,{children:"BRANCH"}),"."]}),"\n",(0,r.jsxs)(n.p,{children:["If you do not know which branch is your default, try ",(0,r.jsx)(n.code,{children:"git branch"})," that outputs all branches in your repository. 
Default branch is typically ",(0,r.jsx)(n.code,{children:"master"}),", but can\nbe ",(0,r.jsx)(n.code,{children:"main"})," or ",(0,r.jsx)(n.code,{children:"trunk"}),"."]}),"\n",(0,r.jsx)(n.pre,{children:(0,r.jsx)(n.code,{children:"aisa$ git status\n# Check if repository is clean\n\n# If you know, what is your default branch, you can skip next command.\naisa$ git branch\n# Find the default branch in the list; should be one of the `master`, `main` or\n# `trunk` and you should not have more than one of those.\n# In case the list clears the terminal and you cannot see shell prompt, you can\n# press `q` to quit the pager.\n\naisa$ git checkout master\n"})}),"\n",(0,r.jsx)(n.hr,{}),"\n",(0,r.jsxs)(n.p,{children:["Adapted from: ",(0,r.jsx)(n.a,{href:"https://www.fi.muni.cz/~xlacko1/pb071/mr.html",children:"https://www.fi.muni.cz/~xlacko1/pb071/mr.html"})]})]})}function d(e={}){const{wrapper:n}={...(0,s.a)(),...e.components};return n?(0,r.jsx)(n,{...e,children:(0,r.jsx)(l,{...e})}):l(e)}},11151:(e,n,t)=>{t.d(n,{Z:()=>a,a:()=>i});var r=t(67294);const s={},o=r.createContext(s);function i(e){const n=r.useContext(o);return r.useMemo((function(){return"function"==typeof e?e(n):{...n,...e}}),[n,e])}function a(e){let n;return n=e.disableParentContext?"function"==typeof e.components?e.components(s):e.components||s:i(e.components),r.createElement(o.Provider,{value:n},e.children)}}}]); \ No newline at end of file +"use strict";(self.webpackChunkfi=self.webpackChunkfi||[]).push([[59],{51456:(e,n,t)=>{t.r(n),t.d(n,{assets:()=>c,contentTitle:()=>i,default:()=>d,frontMatter:()=>o,metadata:()=>a,toc:()=>h});var r=t(85893),s=t(11151);const o={title:"Submitting merge requests"},i="Submitting merge requests for review",a={id:"mr",title:"Submitting merge requests",description:"This tutorial aims to show you how to follow basic git workflow and submit changes",source:"@site/c/mr.md",sourceDirName:".",slug:"/mr",permalink:"/c/mr",draft:!1,unlisted:!1,editUrl:"https://github.com/mfocko/blog/tree/main/c/mr.md",tags:[],version:"current",lastUpdatedAt:1718813674,formattedLastUpdatedAt:"Jun 19, 2024",frontMatter:{title:"Submitting merge requests"},sidebar:"autogeneratedBar",previous:{title:"Practice exam C",permalink:"/c/pexam/cams"}},c={},h=[{value:"Tutorial",id:"tutorial",level:2},{value:"Step #1 - Starting from the clean repository",id:"step-1---starting-from-the-clean-repository",level:3},{value:"Step #2 - Create new branch",id:"step-2---create-new-branch",level:3},{value:"Step #3 - Do the assignment",id:"step-3---do-the-assignment",level:3},{value:"Step #4 - Commit and upload the changes to GitLab",id:"step-4---commit-and-upload-the-changes-to-gitlab",level:3},{value:"Step #5 - Creating a merge request manually",id:"step-5---creating-a-merge-request-manually",level:3},{value:"Step #6 - Set assignees",id:"step-6---set-assignees",level:3},{value:"Step #7 - Return to default branch",id:"step-7---return-to-default-branch",level:3}];function l(e){const n={a:"a",blockquote:"blockquote",code:"code",em:"em",h1:"h1",h2:"h2",h3:"h3",hr:"hr",li:"li",ol:"ol",p:"p",pre:"pre",strong:"strong",...(0,s.a)(),...e.components};return(0,r.jsxs)(r.Fragment,{children:[(0,r.jsx)(n.h1,{id:"submitting-merge-requests-for-review",children:"Submitting merge requests for review"}),"\n",(0,r.jsxs)(n.p,{children:["This tutorial aims to show you how to follow basic git workflow and submit changes\nthrough ",(0,r.jsx)(n.em,{children:"Merge Requests"})," for review."]}),"\n",(0,r.jsxs)(n.p,{children:["The rudimentary idea behind aims for changes to be 
present on a separate branch\nthat is supposedly ",(0,r.jsx)(n.em,{children:"merged"})," into the default branch. Till then changes can be reviewed\non ",(0,r.jsx)(n.em,{children:"Merge Request"})," and additional changes may be made based on the reviews. Once\nthe changes satisfy requirements, the merge request is merged."]}),"\n",(0,r.jsx)(n.h2,{id:"tutorial",children:"Tutorial"}),"\n",(0,r.jsxs)(n.blockquote,{children:["\n",(0,r.jsxs)(n.p,{children:["Use this tutorial only for bonus assignments ",(0,r.jsx)(n.strong,{children:"made by your tutors"})," or in case\nyou need to make up for the absence."]}),"\n"]}),"\n",(0,r.jsx)(n.h3,{id:"step-1---starting-from-the-clean-repository",children:"Step #1 - Starting from the clean repository"}),"\n",(0,r.jsxs)(n.p,{children:["In your repository (either locally or on aisa) type ",(0,r.jsx)(n.code,{children:"git status"})," and check if your\nrepository is clean and you are present on the main branch (",(0,r.jsx)(n.code,{children:"master"}),", ",(0,r.jsx)(n.code,{children:"main"})," or\n",(0,r.jsx)(n.code,{children:"trunk"}),"). If you do not know what your default branch is, it is probably ",(0,r.jsx)(n.code,{children:"master"}),"\nand you should not be on any other branch."]}),"\n",(0,r.jsx)(n.p,{children:"Output of the command should look like this:"}),"\n",(0,r.jsx)(n.pre,{children:(0,r.jsx)(n.code,{children:"aisa$ git status\nOn branch master # Or main or trunk.\nYour branch is up to date with 'origin/master'.\n\nnothing to commit, working tree clean\n"})}),"\n",(0,r.jsxs)(n.blockquote,{children:["\n",(0,r.jsxs)(n.p,{children:["In case you are on different branch or there are uncommitted changes,\n",(0,r.jsx)(n.strong,{children:"do not continue!!!"})," Clean your repository (commit the changes or discard\nthem), before you continue."]}),"\n"]}),"\n",(0,r.jsx)(n.h3,{id:"step-2---create-new-branch",children:"Step #2 - Create new branch"}),"\n",(0,r.jsx)(n.p,{children:"In your repository write command:"}),"\n",(0,r.jsx)(n.pre,{children:(0,r.jsx)(n.code,{children:"aisa$ git checkout -b BRANCH\nSwitched to a new branch 'BRANCH'\n"})}),"\n",(0,r.jsxs)(n.p,{children:["Instead of ",(0,r.jsx)(n.code,{children:"BRANCH"})," use some reasonable name for the branch. For example if you\nare working on the seminar from 3rd week, name the branch ",(0,r.jsx)(n.code,{children:"seminar-03"}),"."]}),"\n",(0,r.jsx)(n.h3,{id:"step-3---do-the-assignment",children:"Step #3 - Do the assignment"}),"\n",(0,r.jsx)(n.p,{children:"Download the skeleton for the seminar assignment, extract and program. For example\nif you are working on 3rd seminar, you can do so by:"}),"\n",(0,r.jsx)(n.pre,{children:(0,r.jsx)(n.code,{children:"aisa$ wget https://www.fi.muni.cz/pb071/seminars/seminar-03/pb071-seminar-03.zip\naisa$ unzip pb071-seminar-03.zip\n# Now you should have directory 'seminar-03'.\naisa$ rm pb071-seminar-03.zip\naisa$ cd seminar-03\n# You can work on the assignment.\n"})}),"\n",(0,r.jsx)(n.h3,{id:"step-4---commit-and-upload-the-changes-to-gitlab",children:"Step #4 - Commit and upload the changes to GitLab"}),"\n",(0,r.jsxs)(n.p,{children:["The same way you ",(0,r.jsx)(n.em,{children:"add"})," and ",(0,r.jsx)(n.em,{children:"commit"})," files for the homework assignments, you do for\nthe seminar."]}),"\n",(0,r.jsxs)(n.p,{children:["Now you can upload the changes to GitLab. ",(0,r.jsx)(n.code,{children:"git push"})," is not enough, since repository\non GitLab does not know your new branch. 
You can solve this by adding arguments:"]}),"\n",(0,r.jsx)(n.pre,{children:(0,r.jsx)(n.code,{children:"aisa$ git push origin BRANCH\n...\nremote: To create a merge request for BRANCH, visit:\nremote: https://gitlab.fi.muni.cz/login/pb071/merge_requests/new?merge_request%5Bsource_branch%5D=BRANCH\n...\n"})}),"\n",(0,r.jsx)(n.p,{children:"In the output you should have a link for creating a merge request. If you see this\nlink, open it and skip next step."}),"\n",(0,r.jsx)(n.h3,{id:"step-5---creating-a-merge-request-manually",children:"Step #5 - Creating a merge request manually"}),"\n",(0,r.jsxs)(n.ol,{children:["\n",(0,r.jsx)(n.li,{children:"Open your repository on GitLab."}),"\n",(0,r.jsxs)(n.li,{children:["On the left panel click on ",(0,r.jsx)(n.em,{children:"Merge Requests"}),"."]}),"\n",(0,r.jsxs)(n.li,{children:["Click on ",(0,r.jsx)(n.em,{children:"New Merge Request"}),"."]}),"\n",(0,r.jsxs)(n.li,{children:["In ",(0,r.jsx)(n.em,{children:"Source branch"})," select ",(0,r.jsx)(n.code,{children:"login/pb071"})," and ",(0,r.jsx)(n.code,{children:"BRANCH"}),", which you created."]}),"\n",(0,r.jsxs)(n.li,{children:["In ",(0,r.jsx)(n.em,{children:"Target branch"})," select ",(0,r.jsx)(n.code,{children:"login/pb071"})," and your default branch you have seen\nin the output of the first command. (most likely ",(0,r.jsx)(n.code,{children:"master"}),")"]}),"\n",(0,r.jsxs)(n.li,{children:["Click on ",(0,r.jsx)(n.em,{children:"Compare branches and continue"}),"."]}),"\n"]}),"\n",(0,r.jsx)(n.h3,{id:"step-6---set-assignees",children:"Step #6 - Set assignees"}),"\n",(0,r.jsxs)(n.p,{children:["On the page that is opened, please check at the top that you are creating merge\nrequest ",(0,r.jsx)(n.strong,{children:"from"})," your new branch ",(0,r.jsx)(n.strong,{children:"to"})," your default branch (one of ",(0,r.jsx)(n.code,{children:"master"}),", ",(0,r.jsx)(n.code,{children:"main"}),"\nor ",(0,r.jsx)(n.code,{children:"trunk"}),")."]}),"\n",(0,r.jsxs)(n.p,{children:["Then in the field ",(0,r.jsx)(n.em,{children:"Assignees"})," set your tutors based on the seminar group. You can\nuse login for a quick look up."]}),"\n",(0,r.jsxs)(n.p,{children:["In the end click on ",(0,r.jsx)(n.em,{children:"Submit merge request"}),"."]}),"\n",(0,r.jsx)(n.h3,{id:"step-7---return-to-default-branch",children:"Step #7 - Return to default branch"}),"\n",(0,r.jsx)(n.p,{children:"Homework assignments can be submitted only from branches specified in the rules\nfor the course. Because of that, before you do anything else, you should switch\nback to your default branch."}),"\n",(0,r.jsxs)(n.p,{children:["First of all, same as in step #1, check that your repository is clean with ",(0,r.jsx)(n.code,{children:"git status"}),".\nFor the sake of safety, do not continue without clean repository. Then with command\n",(0,r.jsx)(n.code,{children:"git checkout BRANCH"})," switch to your default branch ",(0,r.jsx)(n.code,{children:"BRANCH"}),"."]}),"\n",(0,r.jsxs)(n.p,{children:["If you do not know which branch is your default, try ",(0,r.jsx)(n.code,{children:"git branch"})," that outputs all branches in your repository. 
Default branch is typically ",(0,r.jsx)(n.code,{children:"master"}),", but can\nbe ",(0,r.jsx)(n.code,{children:"main"})," or ",(0,r.jsx)(n.code,{children:"trunk"}),"."]}),"\n",(0,r.jsx)(n.pre,{children:(0,r.jsx)(n.code,{children:"aisa$ git status\n# Check if repository is clean\n\n# If you know, what is your default branch, you can skip next command.\naisa$ git branch\n# Find the default branch in the list; should be one of the `master`, `main` or\n# `trunk` and you should not have more than one of those.\n# In case the list clears the terminal and you cannot see shell prompt, you can\n# press `q` to quit the pager.\n\naisa$ git checkout master\n"})}),"\n",(0,r.jsx)(n.hr,{}),"\n",(0,r.jsxs)(n.p,{children:["Adapted from: ",(0,r.jsx)(n.a,{href:"https://www.fi.muni.cz/~xlacko1/pb071/mr.html",children:"https://www.fi.muni.cz/~xlacko1/pb071/mr.html"})]})]})}function d(e={}){const{wrapper:n}={...(0,s.a)(),...e.components};return n?(0,r.jsx)(n,{...e,children:(0,r.jsx)(l,{...e})}):l(e)}},11151:(e,n,t)=>{t.d(n,{Z:()=>a,a:()=>i});var r=t(67294);const s={},o=r.createContext(s);function i(e){const n=r.useContext(o);return r.useMemo((function(){return"function"==typeof e?e(n):{...n,...e}}),[n,e])}function a(e){let n;return n=e.disableParentContext?"function"==typeof e.components?e.components(s):e.components||s:i(e.components),r.createElement(o.Provider,{value:n},e.children)}}}]); \ No newline at end of file diff --git a/assets/js/d05e838c.7590759c.js b/assets/js/d05e838c.6307f3c6.js similarity index 99% rename from assets/js/d05e838c.7590759c.js rename to assets/js/d05e838c.6307f3c6.js index a5f5e92..cd5a7a9 100644 --- a/assets/js/d05e838c.7590759c.js +++ b/assets/js/d05e838c.6307f3c6.js @@ -1 +1 @@ -"use strict";(self.webpackChunkfi=self.webpackChunkfi||[]).push([[6544],{63004:(e,n,r)=>{r.r(n),r.d(n,{assets:()=>d,contentTitle:()=>c,default:()=>a,frontMatter:()=>i,metadata:()=>o,toc:()=>l});var s=r(85893),t=r(11151);const i={id:"seminar-05-06",title:"5th and 6th seminar",description:"200IQ encryption.\n"},c=void 0,o={id:"bonuses/seminar-05-06",title:"5th and 6th seminar",description:"200IQ encryption.\n",source:"@site/c/bonuses/05-06.md",sourceDirName:"bonuses",slug:"/bonuses/seminar-05-06",permalink:"/c/bonuses/seminar-05-06",draft:!1,unlisted:!1,editUrl:"https://github.com/mfocko/blog/tree/main/c/bonuses/05-06.md",tags:[],version:"current",lastUpdatedAt:1718801795,formattedLastUpdatedAt:"Jun 19, 2024",frontMatter:{id:"seminar-05-06",title:"5th and 6th seminar",description:"200IQ encryption.\n"},sidebar:"autogeneratedBar",previous:{title:"4th seminar",permalink:"/c/bonuses/seminar-04"},next:{title:"8th seminar",permalink:"/c/bonuses/seminar-08"}},d={},l=[{value:"Introduction",id:"introduction",level:2},{value:"Task no. 1: Reverse (0.5 K\u20a1)",id:"task-no-1-reverse-05-k",level:3},{value:"Task no. 2: Vigen\xe8re (0.5 K\u20a1)",id:"task-no-2-vigen\xe8re-05-k",level:3},{value:"Bonus part (0.5 K\u20a1)",id:"bonus-part-05-k",level:4},{value:"Task no. 3: Bit madness (0.5 K\u20a1)",id:"task-no-3-bit-madness-05-k",level:3},{value:"Task no. 
4: All combined to BMP (0.5 K\u20a1)",id:"task-no-4-all-combined-to-bmp-05-k",level:3},{value:"Submitting",id:"submitting",level:2}];function h(e){const n={a:"a",code:"code",h2:"h2",h3:"h3",h4:"h4",hr:"hr",li:"li",ol:"ol",p:"p",pre:"pre",ul:"ul",...(0,t.a)(),...e.components};return(0,s.jsxs)(s.Fragment,{children:[(0,s.jsx)(n.p,{children:"For this bonus you can get at maximum 2.5 K\u20a1."}),"\n",(0,s.jsx)(n.p,{children:(0,s.jsx)(n.a,{href:"pathname:///files/c/bonuses/05-06.tar.gz",children:"Source"})}),"\n",(0,s.jsx)(n.h2,{id:"introduction",children:"Introduction"}),"\n",(0,s.jsx)(n.p,{children:"In this bonus you will implement few functions that will be used together for\nimplementing a very special cipher."}),"\n",(0,s.jsx)(n.h3,{id:"task-no-1-reverse-05-k",children:"Task no. 1: Reverse (0.5 K\u20a1)"}),"\n",(0,s.jsxs)(n.p,{children:["Write a function ",(0,s.jsx)(n.code,{children:"char* reverse(const char* text)"})," that returns copy of the input\nstring in reversed order (also uppercase)."]}),"\n",(0,s.jsxs)(n.p,{children:["In case you are given ",(0,s.jsx)(n.code,{children:"NULL"}),", return ",(0,s.jsx)(n.code,{children:"NULL"}),"."]}),"\n",(0,s.jsx)(n.p,{children:"Example (more in tests):"}),"\n",(0,s.jsx)(n.pre,{children:(0,s.jsx)(n.code,{className:"language-c",children:'char* reversed = reverse("Hello world!");\n\nprintf("%s\\n", reversed);\n// "!DLROW OLLEH"\n\nif (reversed != NULL) {\n free(reversed);\n}\n'})}),"\n",(0,s.jsx)(n.h3,{id:"task-no-2-vigen\xe8re-05-k",children:"Task no. 2: Vigen\xe8re (0.5 K\u20a1)"}),"\n",(0,s.jsx)(n.p,{children:"Vigen\xe8re cipher is similar to the Caesar cipher, but you also have a key that is\nused for encrypting (or decrypting)."}),"\n",(0,s.jsx)(n.p,{children:"Your task is to write two functions:"}),"\n",(0,s.jsxs)(n.ul,{children:["\n",(0,s.jsxs)(n.li,{children:[(0,s.jsx)(n.code,{children:"char* vigenere_encrypt(const char* key, const char* text)"})," for encrypting"]}),"\n",(0,s.jsxs)(n.li,{children:[(0,s.jsx)(n.code,{children:"char* vigenere_decrypt(const char* key, const char* text)"})," for decrypting"]}),"\n"]}),"\n",(0,s.jsx)(n.p,{children:"In both of those you should return uppercase characters."}),"\n",(0,s.jsx)(n.p,{children:"Meaning of the parameters you are given:"}),"\n",(0,s.jsxs)(n.ul,{children:["\n",(0,s.jsxs)(n.li,{children:[(0,s.jsx)(n.code,{children:"key"})," - String that represents key that is used for *crypting. It consists of\none word and can have only characters of the alphabet. Does not matter if they\nare uppercase or lowercase."]}),"\n",(0,s.jsxs)(n.li,{children:[(0,s.jsx)(n.code,{children:"text"})," - String that is to be *crypted."]}),"\n"]}),"\n",(0,s.jsxs)(n.p,{children:["Function returns address of the encrypted (or decrypted) string. Or ",(0,s.jsx)(n.code,{children:"NULL"})," in case\nerror occurs."]}),"\n",(0,s.jsx)(n.p,{children:"Example:"}),"\n",(0,s.jsx)(n.pre,{children:(0,s.jsx)(n.code,{className:"language-c",children:'char *encrypted = vigenere_encrypt("CoMPuTeR", "Hello world!");\n\nprintf("%s\\n", encrypted);\n// "JSXAI PSINR!"\n\nif (encrypted != NULL) {\n free(encrypted)\n}\n'})}),"\n",(0,s.jsx)(n.h4,{id:"bonus-part-05-k",children:"Bonus part (0.5 K\u20a1)"}),"\n",(0,s.jsx)(n.p,{children:"If you can utilize helper function that would do both encrypting and decrypting,\nyou can gain 0.5 K\u20a1."}),"\n",(0,s.jsxs)(n.p,{children:["Usage of ",(0,s.jsx)(n.code,{children:"true"}),"/",(0,s.jsx)(n.code,{children:"false"})," to decide path in code is prohibited. It leads to merging\nof both functions into one. 
Point of this part is to discover a way to do this\ngenerically in such way that there are no separate paths for one or the other. One\nfunction with no branching for both of them, parametrization is your friend :)"]}),"\n",(0,s.jsx)(n.h3,{id:"task-no-3-bit-madness-05-k",children:"Task no. 3: Bit madness (0.5 K\u20a1)"}),"\n",(0,s.jsx)(n.p,{children:"This is a state of the art crypto. Please do not share :)"}),"\n",(0,s.jsx)(n.p,{children:"For encrypting:"}),"\n",(0,s.jsxs)(n.ol,{children:["\n",(0,s.jsx)(n.li,{children:"Split the character that is to be encrypted in halves (4 and 4 bits each)."}),"\n",(0,s.jsx)(n.li,{children:"Bits in 1st half are to be split into pairs. Swap bits in those pairs."}),"\n",(0,s.jsxs)(n.li,{children:["Then use the 4 bits that you created in the 2nd step for ",(0,s.jsx)(n.code,{children:"XOR"})," with the other\n4 bits."]}),"\n"]}),"\n",(0,s.jsxs)(n.p,{children:["This simple and ingenious principle will be illustrated on the following example.\nString we want to encrypt is ",(0,s.jsx)(n.code,{children:"Hello world!"}),". We need to encrypt each letter separately,\nso we will demonstrate on letter ",(0,s.jsx)(n.code,{children:"H"}),":"]}),"\n",(0,s.jsxs)(n.ol,{children:["\n",(0,s.jsxs)(n.li,{children:["\n",(0,s.jsxs)(n.p,{children:["Letter ",(0,s.jsx)(n.code,{children:"H"})," is represented in ASCII as ",(0,s.jsx)(n.code,{children:"72"}),"."]}),"\n",(0,s.jsxs)(n.p,{children:[(0,s.jsx)(n.code,{children:"72"})," represented in binary is: ",(0,s.jsx)(n.code,{children:"01001000"}),". So first 4 bits are: ",(0,s.jsx)(n.code,{children:"0100"})," and last\n4 bits are ",(0,s.jsx)(n.code,{children:"1000"}),"."]}),"\n"]}),"\n",(0,s.jsxs)(n.li,{children:["\n",(0,s.jsxs)(n.p,{children:["First half of bits (",(0,s.jsx)(n.code,{children:"0100"}),") consists of 2 pairs (",(0,s.jsx)(n.code,{children:"01"})," and ",(0,s.jsx)(n.code,{children:"00"}),") which we swap\n(",(0,s.jsx)(n.code,{children:"01 ~> 10"})," and ",(0,s.jsx)(n.code,{children:"00 ~> 00"}),"). 
That way we get ",(0,s.jsx)(n.code,{children:"1000"}),"."]}),"\n"]}),"\n",(0,s.jsxs)(n.li,{children:["\n",(0,s.jsx)(n.p,{children:"That half is used for xor with the other 4 bits:"}),"\n",(0,s.jsx)(n.pre,{children:(0,s.jsx)(n.code,{children:" 1000 // second half\nXOR 1000 // first half after 2nd step\n--------\n 0000\n"})}),"\n"]}),"\n",(0,s.jsxs)(n.li,{children:["\n",(0,s.jsxs)(n.p,{children:["Now we combine both halves (first one is ",(0,s.jsx)(n.code,{children:"1000"}),", which we got from the 2nd step\nand second one is ",(0,s.jsx)(n.code,{children:"0000"}),", which we got from the 3rd step) and get ",(0,s.jsx)(n.code,{children:"10000000"}),",\nwhich is encrypted character ",(0,s.jsx)(n.code,{children:"H"})," using this method."]}),"\n"]}),"\n"]}),"\n",(0,s.jsx)(n.p,{children:"In case of decryption, reverse those steps."}),"\n",(0,s.jsx)(n.p,{children:"Your task is to implement functions:"}),"\n",(0,s.jsxs)(n.ul,{children:["\n",(0,s.jsx)(n.li,{children:(0,s.jsx)(n.code,{children:"unsigned char* bit_encrypt(const char* text)"})}),"\n",(0,s.jsx)(n.li,{children:(0,s.jsx)(n.code,{children:"char* bit_decrypt(const unsigned char* text)"})}),"\n"]}),"\n",(0,s.jsx)(n.p,{children:"Example:"}),"\n",(0,s.jsx)(n.pre,{children:(0,s.jsx)(n.code,{className:"language-c",children:'unsigned char* encrypted = bit_encrypt("Hello world!");\n\nfor (int i = 0; i < 12;i++) {\n printf("%x ", encrypted[i]);\n //80 9c 95 95 96 11 bc 96 b9 95 9d 10\n}\n\nif (encrypted != NULL) {\n free(encrypted);\n}\n'})}),"\n",(0,s.jsx)(n.h3,{id:"task-no-4-all-combined-to-bmp-05-k",children:"Task no. 4: All combined to BMP (0.5 K\u20a1)"}),"\n",(0,s.jsx)(n.p,{children:"Authors of the BMP cipher are non-disclosed :)"}),"\n",(0,s.jsx)(n.p,{children:"Create pair of functions:"}),"\n",(0,s.jsxs)(n.ul,{children:["\n",(0,s.jsx)(n.li,{children:(0,s.jsx)(n.code,{children:"unsigned char* bmp_encrypt(const char* key, const char* text)"})}),"\n",(0,s.jsx)(n.li,{children:(0,s.jsx)(n.code,{children:"char* bmp_decrypt(const char* key, const unsigned char* text)"})}),"\n"]}),"\n",(0,s.jsx)(n.p,{children:"BMP cipher consists of following steps for encrypting:"}),"\n",(0,s.jsxs)(n.ol,{children:["\n",(0,s.jsx)(n.li,{children:"Reverse the input string"}),"\n",(0,s.jsx)(n.li,{children:"Use Vigenere on the string you got from step #1"}),"\n",(0,s.jsx)(n.li,{children:"Use bit madness on the string you got from step #2"}),"\n"]}),"\n",(0,s.jsx)(n.p,{children:"For decrypting, reverse the steps."}),"\n",(0,s.jsx)(n.h2,{id:"submitting",children:"Submitting"}),"\n",(0,s.jsx)(n.p,{children:"In case you have any questions, feel free to reach out to me."}),"\n",(0,s.jsx)(n.hr,{})]})}function a(e={}){const{wrapper:n}={...(0,t.a)(),...e.components};return n?(0,s.jsx)(n,{...e,children:(0,s.jsx)(h,{...e})}):h(e)}},11151:(e,n,r)=>{r.d(n,{Z:()=>o,a:()=>c});var s=r(67294);const t={},i=s.createContext(t);function c(e){const n=s.useContext(i);return s.useMemo((function(){return"function"==typeof e?e(n):{...n,...e}}),[n,e])}function o(e){let n;return n=e.disableParentContext?"function"==typeof e.components?e.components(t):e.components||t:c(e.components),s.createElement(i.Provider,{value:n},e.children)}}}]); \ No newline at end of file +"use strict";(self.webpackChunkfi=self.webpackChunkfi||[]).push([[6544],{63004:(e,n,r)=>{r.r(n),r.d(n,{assets:()=>d,contentTitle:()=>c,default:()=>a,frontMatter:()=>i,metadata:()=>o,toc:()=>l});var s=r(85893),t=r(11151);const i={id:"seminar-05-06",title:"5th and 6th seminar",description:"200IQ encryption.\n"},c=void 
0,o={id:"bonuses/seminar-05-06",title:"5th and 6th seminar",description:"200IQ encryption.\n",source:"@site/c/bonuses/05-06.md",sourceDirName:"bonuses",slug:"/bonuses/seminar-05-06",permalink:"/c/bonuses/seminar-05-06",draft:!1,unlisted:!1,editUrl:"https://github.com/mfocko/blog/tree/main/c/bonuses/05-06.md",tags:[],version:"current",lastUpdatedAt:1718813674,formattedLastUpdatedAt:"Jun 19, 2024",frontMatter:{id:"seminar-05-06",title:"5th and 6th seminar",description:"200IQ encryption.\n"},sidebar:"autogeneratedBar",previous:{title:"4th seminar",permalink:"/c/bonuses/seminar-04"},next:{title:"8th seminar",permalink:"/c/bonuses/seminar-08"}},d={},l=[{value:"Introduction",id:"introduction",level:2},{value:"Task no. 1: Reverse (0.5 K\u20a1)",id:"task-no-1-reverse-05-k",level:3},{value:"Task no. 2: Vigen\xe8re (0.5 K\u20a1)",id:"task-no-2-vigen\xe8re-05-k",level:3},{value:"Bonus part (0.5 K\u20a1)",id:"bonus-part-05-k",level:4},{value:"Task no. 3: Bit madness (0.5 K\u20a1)",id:"task-no-3-bit-madness-05-k",level:3},{value:"Task no. 4: All combined to BMP (0.5 K\u20a1)",id:"task-no-4-all-combined-to-bmp-05-k",level:3},{value:"Submitting",id:"submitting",level:2}];function h(e){const n={a:"a",code:"code",h2:"h2",h3:"h3",h4:"h4",hr:"hr",li:"li",ol:"ol",p:"p",pre:"pre",ul:"ul",...(0,t.a)(),...e.components};return(0,s.jsxs)(s.Fragment,{children:[(0,s.jsx)(n.p,{children:"For this bonus you can get at maximum 2.5 K\u20a1."}),"\n",(0,s.jsx)(n.p,{children:(0,s.jsx)(n.a,{href:"pathname:///files/c/bonuses/05-06.tar.gz",children:"Source"})}),"\n",(0,s.jsx)(n.h2,{id:"introduction",children:"Introduction"}),"\n",(0,s.jsx)(n.p,{children:"In this bonus you will implement few functions that will be used together for\nimplementing a very special cipher."}),"\n",(0,s.jsx)(n.h3,{id:"task-no-1-reverse-05-k",children:"Task no. 1: Reverse (0.5 K\u20a1)"}),"\n",(0,s.jsxs)(n.p,{children:["Write a function ",(0,s.jsx)(n.code,{children:"char* reverse(const char* text)"})," that returns copy of the input\nstring in reversed order (also uppercase)."]}),"\n",(0,s.jsxs)(n.p,{children:["In case you are given ",(0,s.jsx)(n.code,{children:"NULL"}),", return ",(0,s.jsx)(n.code,{children:"NULL"}),"."]}),"\n",(0,s.jsx)(n.p,{children:"Example (more in tests):"}),"\n",(0,s.jsx)(n.pre,{children:(0,s.jsx)(n.code,{className:"language-c",children:'char* reversed = reverse("Hello world!");\n\nprintf("%s\\n", reversed);\n// "!DLROW OLLEH"\n\nif (reversed != NULL) {\n free(reversed);\n}\n'})}),"\n",(0,s.jsx)(n.h3,{id:"task-no-2-vigen\xe8re-05-k",children:"Task no. 2: Vigen\xe8re (0.5 K\u20a1)"}),"\n",(0,s.jsx)(n.p,{children:"Vigen\xe8re cipher is similar to the Caesar cipher, but you also have a key that is\nused for encrypting (or decrypting)."}),"\n",(0,s.jsx)(n.p,{children:"Your task is to write two functions:"}),"\n",(0,s.jsxs)(n.ul,{children:["\n",(0,s.jsxs)(n.li,{children:[(0,s.jsx)(n.code,{children:"char* vigenere_encrypt(const char* key, const char* text)"})," for encrypting"]}),"\n",(0,s.jsxs)(n.li,{children:[(0,s.jsx)(n.code,{children:"char* vigenere_decrypt(const char* key, const char* text)"})," for decrypting"]}),"\n"]}),"\n",(0,s.jsx)(n.p,{children:"In both of those you should return uppercase characters."}),"\n",(0,s.jsx)(n.p,{children:"Meaning of the parameters you are given:"}),"\n",(0,s.jsxs)(n.ul,{children:["\n",(0,s.jsxs)(n.li,{children:[(0,s.jsx)(n.code,{children:"key"})," - String that represents key that is used for *crypting. It consists of\none word and can have only characters of the alphabet. 
Does not matter if they\nare uppercase or lowercase."]}),"\n",(0,s.jsxs)(n.li,{children:[(0,s.jsx)(n.code,{children:"text"})," - String that is to be *crypted."]}),"\n"]}),"\n",(0,s.jsxs)(n.p,{children:["Function returns address of the encrypted (or decrypted) string. Or ",(0,s.jsx)(n.code,{children:"NULL"})," in case\nerror occurs."]}),"\n",(0,s.jsx)(n.p,{children:"Example:"}),"\n",(0,s.jsx)(n.pre,{children:(0,s.jsx)(n.code,{className:"language-c",children:'char *encrypted = vigenere_encrypt("CoMPuTeR", "Hello world!");\n\nprintf("%s\\n", encrypted);\n// "JSXAI PSINR!"\n\nif (encrypted != NULL) {\n free(encrypted)\n}\n'})}),"\n",(0,s.jsx)(n.h4,{id:"bonus-part-05-k",children:"Bonus part (0.5 K\u20a1)"}),"\n",(0,s.jsx)(n.p,{children:"If you can utilize helper function that would do both encrypting and decrypting,\nyou can gain 0.5 K\u20a1."}),"\n",(0,s.jsxs)(n.p,{children:["Usage of ",(0,s.jsx)(n.code,{children:"true"}),"/",(0,s.jsx)(n.code,{children:"false"})," to decide path in code is prohibited. It leads to merging\nof both functions into one. Point of this part is to discover a way to do this\ngenerically in such way that there are no separate paths for one or the other. One\nfunction with no branching for both of them, parametrization is your friend :)"]}),"\n",(0,s.jsx)(n.h3,{id:"task-no-3-bit-madness-05-k",children:"Task no. 3: Bit madness (0.5 K\u20a1)"}),"\n",(0,s.jsx)(n.p,{children:"This is a state of the art crypto. Please do not share :)"}),"\n",(0,s.jsx)(n.p,{children:"For encrypting:"}),"\n",(0,s.jsxs)(n.ol,{children:["\n",(0,s.jsx)(n.li,{children:"Split the character that is to be encrypted in halves (4 and 4 bits each)."}),"\n",(0,s.jsx)(n.li,{children:"Bits in 1st half are to be split into pairs. Swap bits in those pairs."}),"\n",(0,s.jsxs)(n.li,{children:["Then use the 4 bits that you created in the 2nd step for ",(0,s.jsx)(n.code,{children:"XOR"})," with the other\n4 bits."]}),"\n"]}),"\n",(0,s.jsxs)(n.p,{children:["This simple and ingenious principle will be illustrated on the following example.\nString we want to encrypt is ",(0,s.jsx)(n.code,{children:"Hello world!"}),". We need to encrypt each letter separately,\nso we will demonstrate on letter ",(0,s.jsx)(n.code,{children:"H"}),":"]}),"\n",(0,s.jsxs)(n.ol,{children:["\n",(0,s.jsxs)(n.li,{children:["\n",(0,s.jsxs)(n.p,{children:["Letter ",(0,s.jsx)(n.code,{children:"H"})," is represented in ASCII as ",(0,s.jsx)(n.code,{children:"72"}),"."]}),"\n",(0,s.jsxs)(n.p,{children:[(0,s.jsx)(n.code,{children:"72"})," represented in binary is: ",(0,s.jsx)(n.code,{children:"01001000"}),". So first 4 bits are: ",(0,s.jsx)(n.code,{children:"0100"})," and last\n4 bits are ",(0,s.jsx)(n.code,{children:"1000"}),"."]}),"\n"]}),"\n",(0,s.jsxs)(n.li,{children:["\n",(0,s.jsxs)(n.p,{children:["First half of bits (",(0,s.jsx)(n.code,{children:"0100"}),") consists of 2 pairs (",(0,s.jsx)(n.code,{children:"01"})," and ",(0,s.jsx)(n.code,{children:"00"}),") which we swap\n(",(0,s.jsx)(n.code,{children:"01 ~> 10"})," and ",(0,s.jsx)(n.code,{children:"00 ~> 00"}),"). 
That way we get ",(0,s.jsx)(n.code,{children:"1000"}),"."]}),"\n"]}),"\n",(0,s.jsxs)(n.li,{children:["\n",(0,s.jsx)(n.p,{children:"That half is used for xor with the other 4 bits:"}),"\n",(0,s.jsx)(n.pre,{children:(0,s.jsx)(n.code,{children:" 1000 // second half\nXOR 1000 // first half after 2nd step\n--------\n 0000\n"})}),"\n"]}),"\n",(0,s.jsxs)(n.li,{children:["\n",(0,s.jsxs)(n.p,{children:["Now we combine both halves (first one is ",(0,s.jsx)(n.code,{children:"1000"}),", which we got from the 2nd step\nand second one is ",(0,s.jsx)(n.code,{children:"0000"}),", which we got from the 3rd step) and get ",(0,s.jsx)(n.code,{children:"10000000"}),",\nwhich is encrypted character ",(0,s.jsx)(n.code,{children:"H"})," using this method."]}),"\n"]}),"\n"]}),"\n",(0,s.jsx)(n.p,{children:"In case of decryption, reverse those steps."}),"\n",(0,s.jsx)(n.p,{children:"Your task is to implement functions:"}),"\n",(0,s.jsxs)(n.ul,{children:["\n",(0,s.jsx)(n.li,{children:(0,s.jsx)(n.code,{children:"unsigned char* bit_encrypt(const char* text)"})}),"\n",(0,s.jsx)(n.li,{children:(0,s.jsx)(n.code,{children:"char* bit_decrypt(const unsigned char* text)"})}),"\n"]}),"\n",(0,s.jsx)(n.p,{children:"Example:"}),"\n",(0,s.jsx)(n.pre,{children:(0,s.jsx)(n.code,{className:"language-c",children:'unsigned char* encrypted = bit_encrypt("Hello world!");\n\nfor (int i = 0; i < 12;i++) {\n printf("%x ", encrypted[i]);\n //80 9c 95 95 96 11 bc 96 b9 95 9d 10\n}\n\nif (encrypted != NULL) {\n free(encrypted);\n}\n'})}),"\n",(0,s.jsx)(n.h3,{id:"task-no-4-all-combined-to-bmp-05-k",children:"Task no. 4: All combined to BMP (0.5 K\u20a1)"}),"\n",(0,s.jsx)(n.p,{children:"Authors of the BMP cipher are non-disclosed :)"}),"\n",(0,s.jsx)(n.p,{children:"Create pair of functions:"}),"\n",(0,s.jsxs)(n.ul,{children:["\n",(0,s.jsx)(n.li,{children:(0,s.jsx)(n.code,{children:"unsigned char* bmp_encrypt(const char* key, const char* text)"})}),"\n",(0,s.jsx)(n.li,{children:(0,s.jsx)(n.code,{children:"char* bmp_decrypt(const char* key, const unsigned char* text)"})}),"\n"]}),"\n",(0,s.jsx)(n.p,{children:"BMP cipher consists of following steps for encrypting:"}),"\n",(0,s.jsxs)(n.ol,{children:["\n",(0,s.jsx)(n.li,{children:"Reverse the input string"}),"\n",(0,s.jsx)(n.li,{children:"Use Vigenere on the string you got from step #1"}),"\n",(0,s.jsx)(n.li,{children:"Use bit madness on the string you got from step #2"}),"\n"]}),"\n",(0,s.jsx)(n.p,{children:"For decrypting, reverse the steps."}),"\n",(0,s.jsx)(n.h2,{id:"submitting",children:"Submitting"}),"\n",(0,s.jsx)(n.p,{children:"In case you have any questions, feel free to reach out to me."}),"\n",(0,s.jsx)(n.hr,{})]})}function a(e={}){const{wrapper:n}={...(0,t.a)(),...e.components};return n?(0,s.jsx)(n,{...e,children:(0,s.jsx)(h,{...e})}):h(e)}},11151:(e,n,r)=>{r.d(n,{Z:()=>o,a:()=>c});var s=r(67294);const t={},i=s.createContext(t);function c(e){const n=s.useContext(i);return s.useMemo((function(){return"function"==typeof e?e(n):{...n,...e}}),[n,e])}function o(e){let n;return n=e.disableParentContext?"function"==typeof e.components?e.components(t):e.components||t:c(e.components),s.createElement(i.Provider,{value:n},e.children)}}}]); \ No newline at end of file diff --git a/assets/js/main.8dc1a0ec.js b/assets/js/main.7af15fef.js similarity index 98% rename from assets/js/main.8dc1a0ec.js rename to assets/js/main.7af15fef.js index 6941ef8..ee14ecf 100644 --- a/assets/js/main.8dc1a0ec.js +++ b/assets/js/main.7af15fef.js @@ -1,2 +1,2 @@ -/*! 
For license information please see main.8dc1a0ec.js.LICENSE.txt */ -(self.webpackChunkfi=self.webpackChunkfi||[]).push([[179],{20830:(e,t,n)=>{"use strict";n.d(t,{W:()=>a});var r=n(67294);function a(){return r.createElement("svg",{width:"20",height:"20",className:"DocSearch-Search-Icon",viewBox:"0 0 20 20"},r.createElement("path",{d:"M14.386 14.386l4.0877 4.0877-4.0877-4.0877c-2.9418 2.9419-7.7115 2.9419-10.6533 0-2.9419-2.9418-2.9419-7.7115 0-10.6533 2.9418-2.9419 7.7115-2.9419 10.6533 0 2.9419 2.9418 2.9419 7.7115 0 10.6533z",stroke:"currentColor",fill:"none",fillRule:"evenodd",strokeLinecap:"round",strokeLinejoin:"round"}))}},723:(e,t,n)=>{"use strict";n.d(t,{Z:()=>p});n(67294);var r=n(68356),a=n.n(r),o=n(16887);const i={"0123bc76":[()=>n.e(3734).then(n.t.bind(n,76554,19)),"~docs/algorithms/tag-algorithms-tags-c-e22.json",76554],"0178f9ad":[()=>n.e(9898).then(n.bind(n,35610)),"@site/algorithms/08-rb-trees/2022-04-05-applications.md",35610],"01a85c17":[()=>Promise.all([n.e(532),n.e(4013)]).then(n.bind(n,24524)),"@theme/BlogTagsListPage",24524],"0220f5fc":[()=>n.e(1378).then(n.t.bind(n,85804,19)),"/home/runner/work/blog/blog/.docusaurus/docusaurus-plugin-content-blog/blog/plugin-route-context-module-100.json",85804],"0608d96f":[()=>n.e(7568).then(n.t.bind(n,77158,19)),"~blog/blog/blog-tags-vps-843-list.json",77158],"06c4a8fc":[()=>n.e(2125).then(n.t.bind(n,4697,19)),"~docs/algorithms/tag-algorithms-tags-testing-0c4.json",4697],"0816068a":[()=>n.e(2948).then(n.t.bind(n,17702,19)),"~blog/blog/blog-tags-hype-1ee.json",17702],"087808f1":[()=>n.e(3731).then(n.bind(n,48157)),"@site/algorithms/12-hash-tables/2023-11-28-breaking/index.md",48157],"08dfa3a2":[()=>n.e(2606).then(n.t.bind(n,32412,19)),"~docs/algorithms/tag-algorithms-tags-astar-f6e.json",32412],"0bfe45d5":[()=>n.e(4269).then(n.t.bind(n,13847,19)),"~blog/blog/blog-tags-rust-0c9-list.json",13847],"0fcbc6ca":[()=>Promise.all([n.e(532),n.e(1851)]).then(n.bind(n,39900)),"@site/src/pages/talks.tsx",39900],"146d9b84":[()=>n.e(9300).then(n.t.bind(n,96671,19)),"~blog/blog/blog-tags-admin-b05-list.json",96671],"14eb3368":[()=>Promise.all([n.e(532),n.e(9817)]).then(n.bind(n,34228)),"@theme/DocCategoryGeneratedIndexPage",34228],"1535ede8":[()=>n.e(5376).then(n.bind(n,44969)),"@site/c/bonuses/10.md",44969],15966941:[()=>n.e(8326).then(n.bind(n,16721)),"@site/algorithms/12-hash-tables/2023-11-28-breaking/02-mitigations.md",16721],"16cbc838":[()=>n.e(1494).then(n.t.bind(n,98252,19)),"~docs/algorithms/tag-algorithms-tags-iterative-d5b.json",98252],"171c9bb5":[()=>n.e(5329).then(n.t.bind(n,27615,19)),"~blog/blog/blog-tags-devconf-979.json",27615],17896441:[()=>Promise.all([n.e(532),n.e(9365),n.e(7918)]).then(n.bind(n,15154)),"@theme/DocItem",15154],"182b5a8d":[()=>n.e(6048).then(n.bind(n,32582)),"@site/blog/2024-01-28-rust-opinion.md?truncated=true",32582],"19d7c045":[()=>n.e(4637).then(n.t.bind(n,67772,19)),"~blog/blog/blog-tags-advent-of-code-49f.json",67772],"1a2684fd":[()=>n.e(5688).then(n.t.bind(n,99303,19)),"~blog/blog/blog-tags-fedora-bd8.json",99303],"1a4e3797":[()=>Promise.all([n.e(532),n.e(7920)]).then(n.bind(n,48852)),"@theme/SearchPage",48852],"1a606400":[()=>n.e(494).then(n.t.bind(n,82400,19)),"/home/runner/work/blog/blog/.docusaurus/docusaurus-plugin-content-docs/algorithms/plugin-route-context-module-100.json",82400],"1acf65cc":[()=>n.e(8529).then(n.bind(n,34568)),"@site/c/pexam/b-garbage_collect.md",34568],"1cd58e77":[()=>n.e(1547).then(n.bind(n,32090)),"@site/algorithms/04-recursion/2023-08-17-pyramid-slide-down/04-bottom-up-dp.md",32
090],"22a175ec":[()=>Promise.all([n.e(532),n.e(6890)]).then(n.bind(n,40707)),"@site/src/pages/contributions.tsx",40707],"24fecc0a":[()=>n.e(3707).then(n.bind(n,69383)),"@site/algorithms/03-time-complexity/2021-03-31-extend.md",69383],"257fa000":[()=>n.e(9595).then(n.t.bind(n,35455,19)),"~blog/blog/blog-tags-cult-e97-list.json",35455],"28d80ff8":[()=>n.e(6435).then(n.t.bind(n,7465,19)),"~docs/algorithms/tag-algorithms-tags-sorting-d73.json",7465],29694455:[()=>n.e(3388).then(n.t.bind(n,39828,19)),"~blog/blog/blog-tags-iterators-977-list.json",39828],"2af5d0a7":[()=>n.e(3979).then(n.t.bind(n,53703,19)),"~docs/algorithms/tag-algorithms-tags-a-star-775.json",53703],"2b89902a":[()=>n.e(6342).then(n.t.bind(n,45443,19)),"~docs/algorithms/tag-algorithms-tags-recursion-1bd.json",45443],"2ca64e35":[()=>n.e(281).then(n.bind(n,99544)),"@site/algorithms/04-recursion/2022-11-29-karel/index.md",99544],"2d2e3e59":[()=>n.e(6689).then(n.bind(n,55268)),"@site/blog/2024-06-19-devconf-2024.md",55268],"2fcf0558":[()=>n.e(4638).then(n.t.bind(n,69470,19)),"~docs/algorithms/category-algorithms-autogeneratedbar-category-hash-tables-062.json",69470],"3011a4c0":[()=>n.e(7926).then(n.t.bind(n,31670,19)),"~blog/blog/blog-tags-copr-70b-list.json",31670],30814625:[()=>n.e(115).then(n.bind(n,78416)),"@site/algorithms/04-recursion/2022-11-29-karel/2023-12-24-solution.md",78416],"3238adfd":[()=>n.e(7285).then(n.t.bind(n,16107,19)),"~blog/blog/blog-tags-lts-b6c.json",16107],"34ab65f4":[()=>n.e(3220).then(n.t.bind(n,28865,19)),"~docs/algorithms/tag-algorithms-tags-postconditions-1f3.json",28865],"34df9f28":[()=>n.e(9977).then(n.t.bind(n,59267,19)),"~blog/blog/blog-tags-paywall-11b.json",59267],"354a7b72":[()=>n.e(9414).then(n.bind(n,46617)),"@site/algorithms/10-graphs/2022-04-30-bfs-tree.md",46617],"3716fece":[()=>n.e(1511).then(n.bind(n,76225)),"@site/blog/2024-06-19-devconf-2024.md?truncated=true",76225],"3720c009":[()=>Promise.all([n.e(532),n.e(3751)]).then(n.bind(n,10727)),"@theme/DocTagsListPage",10727],"377f3aa1":[()=>n.e(1011).then(n.bind(n,7582)),"@site/blog/aoc-2022/02-week-2.md",7582],"3a0bc46c":[()=>n.e(7524).then(n.t.bind(n,80975,19)),"~blog/blog/blog-tags-fedora-bd8-list.json",80975],"3adcbc3a":[()=>n.e(5701).then(n.bind(n,62535)),"@site/algorithms/11-paths/2024-01-01-bf-to-astar/01-bf.md",62535],"3d92ba6e":[()=>n.e(8236).then(n.t.bind(n,77778,19)),"~docs/algorithms/tag-algorithms-tags-dijkstra-48e.json",77778],"3da4b779":[()=>n.e(2177).then(n.bind(n,28737)),"@site/blog/aoc-2022/04-week-4.md",28737],"4200b1a9":[()=>n.e(866).then(n.t.bind(n,24612,19)),"~blog/blog/blog-archive-80c.json",24612],"45c9e308":[()=>n.e(7084).then(n.bind(n,53181)),"@site/cpp/07-exceptions-and-raii/2023-11-24-placeholders.md",53181],"4621632b":[()=>n.e(3519).then(n.t.bind(n,29760,19)),"~blog/blog/blog-tags-cpp-7c7-list.json",29760],"48b268a6":[()=>n.e(1648).then(n.t.bind(n,35067,19)),"~docs/c/category-c-autogeneratedbar-category-bonuses-216.json",35067],"493c0536":[()=>n.e(7292).then(n.bind(n,45594)),"@site/algorithms/11-paths/2024-01-01-bf-to-astar/03-astar.md",45594],"4e546705":[()=>n.e(4327).then(n.t.bind(n,61795,19)),"~docs/c/version-current-metadata-prop-751.json",61795],"4edd2021":[()=>n.e(5975).then(n.t.bind(n,21705,19)),"~blog/blog/blog-tags-cpp-7c7.json",21705],"4f96b16e":[()=>n.e(6306).then(n.bind(n,24693)),"@site/c/pexam/c-cams.md",24693],"4fd4011a":[()=>n.e(565).then(n.t.bind(n,43050,19)),"~blog/blog/blog-tags-lts-b6c-list.json",43050],51624505:[()=>n.e(4394).then(n.bind(n,32609)),"@site/blog/aoc-2022/00-intro.md",32609],"520f81
75":[()=>n.e(8058).then(n.t.bind(n,24353,19)),"~docs/algorithms/tag-algorithms-tags-cpp-0d2.json",24353],"52f2a5bf":[()=>n.e(5430).then(n.t.bind(n,61387,19)),"~blog/blog/blog-tags-red-hat-df4.json",61387],"534d4833":[()=>n.e(9771).then(n.bind(n,93019)),"@site/algorithms/02-algorithms-correctness/2021-03-18-postcondition-ambiguity.md",93019],"57ac6224":[()=>n.e(2698).then(n.t.bind(n,35340,19)),"~blog/blog/blog-tags-linux-distributions-991-list.json",35340],"595c7293":[()=>n.e(5634).then(n.bind(n,58396)),"@site/c/bonuses/08.md",58396],"5c15401e":[()=>n.e(9579).then(n.t.bind(n,43761,19)),"~docs/algorithms/tag-algorithms-tags-bellman-ford-731.json",43761],"5ca803d2":[()=>n.e(9173).then(n.t.bind(n,24890,19)),"/home/runner/work/blog/blog/.docusaurus/docusaurus-plugin-content-docs/c/plugin-route-context-module-100.json",24890],"5e91a78c":[()=>n.e(1033).then(n.t.bind(n,82547,19)),"~blog/blog/blog-tags-linux-distributions-991.json",82547],"5e95c892":[()=>n.e(9661).then(n.bind(n,41892)),"@theme/DocsRoot",41892],"5e9f5e1a":[()=>Promise.resolve().then(n.bind(n,36809)),"@generated/docusaurus.config",36809],"62d847b3":[()=>n.e(8520).then(n.t.bind(n,91901,19)),"~blog/blog/blog-tags-advent-of-code-2022-3db-list.json",91901],"66d5ef6c":[()=>n.e(9228).then(n.t.bind(n,4087,19)),"~blog/blog/blog-tags-tags-4c2.json",4087],"686a7a89":[()=>n.e(728).then(n.t.bind(n,77507,19)),"~docs/algorithms/tag-algorithms-tags-graphs-31d.json",77507],"6875c492":[()=>Promise.all([n.e(532),n.e(9365),n.e(130),n.e(8610)]).then(n.bind(n,41714)),"@theme/BlogTagsPostsPage",41714],"698e2076":[()=>n.e(3713).then(n.bind(n,38961)),"@site/algorithms/11-paths/2024-01-01-bf-to-astar/02-dijkstra.md",38961],"6bc697d0":[()=>n.e(5287).then(n.t.bind(n,68529,19)),"/home/runner/work/blog/blog/.docusaurus/docusaurus-plugin-content-docs/cpp/plugin-route-context-module-100.json",68529],"6e3cbca1":[()=>n.e(3276).then(n.t.bind(n,29538,19)),"~docs/algorithms/version-current-metadata-prop-751.json",29538],"7052c0bc":[()=>n.e(9731).then(n.bind(n,42286)),"@site/cpp/00-intro.md",42286],"70a4540f":[()=>n.e(9249).then(n.bind(n,44493)),"@site/algorithms/04-recursion/2023-08-17-pyramid-slide-down/01-naive.md",44493],"75cccf44":[()=>n.e(4256).then(n.bind(n,98215)),"@site/blog/leetcode/sort-matrix-diagonally.md?truncated=true",98215],"765ea78b":[()=>n.e(3039).then(n.t.bind(n,83010,19)),"~blog/blog/blog-tags-\ud83c\udfed-551.json",83010],"769debb9":[()=>n.e(9931).then(n.t.bind(n,33792,19)),"~blog/blog/blog-tags-paywall-11b-list.json",33792],"794ef108":[()=>n.e(3803).then(n.bind(n,86427)),"@site/c/00-intro.md",86427],"7a5bb070":[()=>n.e(4582).then(n.t.bind(n,64863,19)),"~blog/blog/blog-tags-memory-safety-1ae.json",64863],"7ce7faac":[()=>n.e(6064).then(n.t.bind(n,12884,19)),"~docs/algorithms/tag-algorithms-tags-solution-61b.json",12884],"7e6d325b":[()=>n.e(3184).then(n.t.bind(n,26139,19)),"~docs/cpp/version-current-metadata-prop-751.json",26139],"84d1e0d8":[()=>n.e(1885).then(n.bind(n,49713)),"@site/algorithms/00-intro.md",49713],"86cd1460":[()=>n.e(1235).then(n.t.bind(n,38968,19)),"~blog/blog/blog-tags-leetcode-042.json",38968],"8a25f659":[()=>n.e(7728).then(n.bind(n,73212)),"@site/algorithms/04-recursion/2023-08-17-pyramid-slide-down/03-top-down-dp.md",73212],"8b1802c5":[()=>n.e(8480).then(n.t.bind(n,60832,19)),"~blog/blog/blog-tags-advent-of-code-49f-list.json",60832],"8c0e532b":[()=>n.e(822).then(n.t.bind(n,73968,19)),"~blog/blog/blog-tags-vps-843.json",73968],"8d31a880":[()=>n.e(9066).then(n.t.bind(n,72232,19)),"~docs/algorithms/tag-algorithms-tags-python-48f.js
on",72232],"8e6bb954":[()=>n.e(5775).then(n.t.bind(n,76206,19)),"~docs/algorithms/tag-algorithms-tags-exponential-60a.json",76206],"9287eafd":[()=>n.e(5521).then(n.t.bind(n,90716,19)),"~blog/blog/blog-tags-rust-0c9.json",90716],"933b95b3":[()=>n.e(3887).then(n.t.bind(n,7405,19)),"~docs/algorithms/category-algorithms-autogeneratedbar-category-recursion-257.json",7405],"947341b7":[()=>n.e(1145).then(n.t.bind(n,2897,19)),"~docs/algorithms/tag-algorithms-tags-bfs-69f.json",2897],"95b96bb9":[()=>n.e(3561).then(n.t.bind(n,24577,19)),"~blog/blog/blog-post-list-prop-blog.json",24577],"95f41f0b":[()=>n.e(9385).then(n.bind(n,93195)),"@site/blog/aoc-2022/01-week-1.md?truncated=true",93195],"962da50c":[()=>n.e(2264).then(n.t.bind(n,9705,19)),"~docs/c/category-c-autogeneratedbar-category-practice-exams-e97.json",9705],"976c4f3b":[()=>n.e(4562).then(n.t.bind(n,69019,19)),"~docs/algorithms/tag-algorithms-tags-java-6c3.json",69019],"97a42631":[()=>n.e(1464).then(n.t.bind(n,77343,19)),"~docs/algorithms/tags-list-current-prop-15a.json",77343],"9a3dc578":[()=>n.e(655).then(n.t.bind(n,9916,19)),"~docs/algorithms/tag-algorithms-tags-dynamic-array-5d3.json",9916],"9ad42b04":[()=>n.e(8041).then(n.t.bind(n,41271,19)),"~blog/blog/blog-tags-devconf-979-list.json",41271],"9b91a88c":[()=>n.e(2545).then(n.bind(n,19466)),"@site/algorithms/04-recursion/2023-08-17-pyramid-slide-down/index.md",19466],"9df0e937":[()=>n.e(2210).then(n.t.bind(n,55256,19)),"~docs/algorithms/category-algorithms-autogeneratedbar-category-graphs-2e2.json",55256],"9e4087bc":[()=>n.e(3608).then(n.bind(n,63169)),"@theme/BlogArchivePage",63169],a082abd3:[()=>n.e(8786).then(n.t.bind(n,73276,19)),"~blog/blog/blog-tags-admin-b05.json",73276],a2ba8888:[()=>n.e(8289).then(n.t.bind(n,55941,19)),"~docs/algorithms/tag-algorithms-tags-brute-force-3cb.json",55941],a4c10cf4:[()=>n.e(4382).then(n.t.bind(n,30685,19)),"~docs/algorithms/tag-algorithms-tags-time-complexity-c50.json",30685],a6a48ea2:[()=>n.e(3618).then(n.bind(n,1176)),"@site/blog/aoc-2022/02-week-2.md?truncated=true",1176],a6aa9e1f:[()=>Promise.all([n.e(532),n.e(9365),n.e(130),n.e(3089)]).then(n.bind(n,80046)),"@theme/BlogListPage",80046],a7098721:[()=>n.e(1050).then(n.t.bind(n,26615,19)),"~blog/blog/blog-c06.json",26615],a7bd4aaa:[()=>n.e(8518).then(n.bind(n,8564)),"@theme/DocVersionRoot",8564],a80747a0:[()=>n.e(5824).then(n.t.bind(n,4464,19)),"~blog/blog/blog-tags-advent-of-code-2022-3db.json",4464],a94703ab:[()=>Promise.all([n.e(532),n.e(4368)]).then(n.bind(n,12674)),"@theme/DocRoot",12674],aa24fd5d:[()=>n.e(7257).then(n.bind(n,90251)),"@site/algorithms/12-hash-tables/2023-11-28-breaking/01-python.md",90251],aa635a28:[()=>n.e(2321).then(n.bind(n,72820)),"@site/blog/2024-02-07-lts-distros.md?truncated=true",72820],ab2721d4:[()=>n.e(7755).then(n.bind(n,53037)),"@site/blog/aoc-2022/04-week-4.md?truncated=true",53037],af8b72a7:[()=>n.e(5658).then(n.bind(n,10507)),"@site/blog/2023-08-02-copr.md?truncated=true",10507],b0291f37:[()=>n.e(6097).then(n.t.bind(n,7085,19)),"/home/runner/work/blog/blog/.docusaurus/docusaurus-theme-search-algolia/default/plugin-route-context-module-100.json",7085],b1288602:[()=>n.e(59).then(n.bind(n,51456)),"@site/c/mr.md",51456],b25fbc58:[()=>n.e(9197).then(n.t.bind(n,75617,19)),"~blog/blog/blog-tags-\ud83c\udfed-551-list.json",75617],b45dccf0:[()=>n.e(9679).then(n.t.bind(n,58296,19)),"~blog/blog/blog-tags-copr-70b.json",58296],b5a32f14:[()=>n.e(2433).then(n.bind(n,31976)),"@site/blog/2023-08-02-copr.md",31976],b8cbf382:[()=>n.e(7438).then(n.t.bind(n,74632,19)),"~docs/algorith
ms/tag-algorithms-tags-greedy-02f.json",74632],b9f7f5c4:[()=>n.e(9179).then(n.bind(n,76699)),"@site/cpp/environment.md",76699],bb882650:[()=>n.e(8091).then(n.bind(n,66765)),"@site/blog/aoc-2022/03-week-3.md?truncated=true",66765],bb984793:[()=>n.e(6864).then(n.t.bind(n,82505,19)),"~docs/algorithms/tag-algorithms-tags-karel-df7.json",82505],bc0c9d90:[()=>n.e(354).then(n.bind(n,50476)),"@site/c/bonuses/04.md",50476],bc2d22bc:[()=>n.e(6519).then(n.t.bind(n,70428,19)),"~docs/algorithms/tag-algorithms-tags-bottom-up-dp-4f9.json",70428],c4c4056e:[()=>n.e(635).then(n.bind(n,61381)),"@site/algorithms/11-paths/2024-01-01-bf-to-astar/index.md",61381],c4f5d8e4:[()=>Promise.all([n.e(532),n.e(4195)]).then(n.bind(n,53261)),"@site/src/pages/index.js",53261],c580b66a:[()=>n.e(6573).then(n.t.bind(n,45021,19)),"~docs/algorithms/tag-algorithms-tags-top-down-dp-c2f.json",45021],c90b7ff3:[()=>n.e(3602).then(n.t.bind(n,44960,19)),"~blog/blog/blog-tags-hype-1ee-list.json",44960],ccc49370:[()=>Promise.all([n.e(532),n.e(9365),n.e(130),n.e(6103)]).then(n.bind(n,65203)),"@theme/BlogPostPage",65203],cfa2b263:[()=>n.e(3086).then(n.bind(n,34437)),"@site/blog/leetcode/sort-matrix-diagonally.md",34437],d05e838c:[()=>n.e(6544).then(n.bind(n,63004)),"@site/c/bonuses/05-06.md",63004],d255bd7f:[()=>n.e(6292).then(n.t.bind(n,60341,19)),"~docs/algorithms/tag-algorithms-tags-red-black-trees-c61.json",60341],d309b5b1:[()=>n.e(8908).then(n.t.bind(n,26102,19)),"~docs/algorithms/category-algorithms-autogeneratedbar-category-algorithms-and-correctness-d51.json",26102],d309eaf6:[()=>n.e(6995).then(n.bind(n,54506)),"@site/blog/2024-02-07-lts-distros.md",54506],d4b1e057:[()=>n.e(1492).then(n.t.bind(n,12842,19)),"~docs/algorithms/tag-algorithms-tags-balanced-trees-b3e.json",12842],d57b4369:[()=>n.e(6179).then(n.t.bind(n,52715,19)),"~docs/algorithms/tag-algorithms-tags-csharp-d1d.json",52715],d675395f:[()=>n.e(2741).then(n.t.bind(n,15745,19)),"/home/runner/work/blog/blog/.docusaurus/docusaurus-plugin-content-pages/default/plugin-route-context-module-100.json",15745],d79dd549:[()=>n.e(5169).then(n.t.bind(n,29261,19)),"~blog/blog/blog-tags-red-hat-df4-list.json",29261],d7f7fb17:[()=>n.e(1171).then(n.bind(n,3455)),"@site/blog/aoc-2022/00-intro.md?truncated=true",3455],d8f4410e:[()=>n.e(2997).then(n.t.bind(n,41941,19)),"~docs/algorithms/tag-algorithms-tags-hash-tables-b36.json",41941],dd841e73:[()=>n.e(2482).then(n.t.bind(n,40155,19)),"~docs/algorithms/tag-algorithms-tags-dynamic-programming-3e6.json",40155],ddc7679f:[()=>n.e(569).then(n.bind(n,64322)),"@site/algorithms/10-graphs/2021-05-18-iterative-and-iterators.md",64322],dead8108:[()=>n.e(8807).then(n.bind(n,21431)),"@site/c/bonuses/03.md",21431],decbf9d1:[()=>n.e(2445).then(n.t.bind(n,88876,19)),"~docs/algorithms/category-algorithms-autogeneratedbar-category-asymptotic-notation-and-time-complexity-e0d.json",88876],df078f58:[()=>n.e(7743).then(n.t.bind(n,88298,19)),"~docs/algorithms/category-algorithms-autogeneratedbar-category-paths-in-graphs-202.json",88298],df0885f0:[()=>n.e(4343).then(n.t.bind(n,34175,19)),"~docs/algorithms/tag-algorithms-tags-iterators-13a.json",34175],df203c0f:[()=>Promise.all([n.e(532),n.e(9924)]).then(n.bind(n,40491)),"@theme/DocTagDocListPage",40491],dff2ebad:[()=>n.e(146).then(n.bind(n,42492)),"@site/blog/aoc-2022/01-week-1.md",42492],e0f0fdef:[()=>n.e(8404).then(n.t.bind(n,25240,19)),"~blog/blog/blog-tags-conferences-51c.json",25240],e1717305:[()=>n.e(70).then(n.t.bind(n,96377,19)),"~blog/blog/blog-tags-support-474-list.json",96377],e1d2ae23:[()=>n.e(1475).then
(n.t.bind(n,36302,19)),"~docs/algorithms/tag-algorithms-tags-applications-020.json",36302],e31003e9:[()=>n.e(1960).then(n.t.bind(n,81695,19)),"~docs/cpp/category-cpp-autogeneratedbar-category-exceptions-and-raii-6e9.json",81695],e89da83e:[()=>n.e(8757).then(n.t.bind(n,97416,19)),"~blog/blog/blog-tags-leetcode-042-list.json",97416],eba2374c:[()=>n.e(8387).then(n.t.bind(n,47662,19)),"~docs/algorithms/tag-algorithms-tags-backtracking-bb2.json",47662],f44abc07:[()=>n.e(2204).then(n.t.bind(n,39006,19)),"~blog/blog/blog-tags-cult-e97.json",39006],f48be158:[()=>n.e(4064).then(n.bind(n,12326)),"@site/blog/aoc-2022/03-week-3.md",12326],f5d65bd1:[()=>n.e(5617).then(n.t.bind(n,66775,19)),"~blog/blog/blog-tags-support-474.json",66775],f7189688:[()=>n.e(4625).then(n.t.bind(n,9843,19)),"~blog/blog/blog-tags-conferences-51c-list.json",9843],f71d1f68:[()=>n.e(6069).then(n.bind(n,13068)),"@site/blog/2024-01-28-rust-opinion.md",13068],f75910c4:[()=>n.e(5934).then(n.bind(n,1910)),"@site/algorithms/04-recursion/2023-08-17-pyramid-slide-down/02-greedy.md",1910],f7d29e9b:[()=>n.e(7959).then(n.t.bind(n,89266,19)),"~blog/blog/blog-tags-memory-safety-1ae-list.json",89266],fb4361d3:[()=>n.e(6327).then(n.t.bind(n,9631,19)),"~docs/algorithms/category-algorithms-autogeneratedbar-category-red-black-trees-d8a.json",9631],ff472cd9:[()=>n.e(8643).then(n.t.bind(n,7122,19)),"~blog/blog/blog-tags-iterators-977.json",7122],ff82dde7:[()=>Promise.all([n.e(532),n.e(8472)]).then(n.bind(n,63935)),"@site/algorithms/08-rb-trees/2023-06-10-rules.md",63935]};var s=n(85893);function l(e){let{error:t,retry:n,pastDelay:r}=e;return t?(0,s.jsxs)("div",{style:{textAlign:"center",color:"#fff",backgroundColor:"#fa383e",borderColor:"#fa383e",borderStyle:"solid",borderRadius:"0.25rem",borderWidth:"1px",boxSizing:"border-box",display:"block",padding:"1rem",flex:"0 0 50%",marginLeft:"25%",marginRight:"25%",marginTop:"5rem",maxWidth:"50%",width:"100%"},children:[(0,s.jsx)("p",{children:String(t)}),(0,s.jsx)("div",{children:(0,s.jsx)("button",{type:"button",onClick:n,children:"Retry"})})]}):r?(0,s.jsx)("div",{style:{display:"flex",justifyContent:"center",alignItems:"center",height:"100vh"},children:(0,s.jsx)("svg",{id:"loader",style:{width:128,height:110,position:"absolute",top:"calc(100vh - 64%)"},viewBox:"0 0 45 45",xmlns:"http://www.w3.org/2000/svg",stroke:"#61dafb",children:(0,s.jsxs)("g",{fill:"none",fillRule:"evenodd",transform:"translate(1 
1)",strokeWidth:"2",children:[(0,s.jsxs)("circle",{cx:"22",cy:"22",r:"6",strokeOpacity:"0",children:[(0,s.jsx)("animate",{attributeName:"r",begin:"1.5s",dur:"3s",values:"6;22",calcMode:"linear",repeatCount:"indefinite"}),(0,s.jsx)("animate",{attributeName:"stroke-opacity",begin:"1.5s",dur:"3s",values:"1;0",calcMode:"linear",repeatCount:"indefinite"}),(0,s.jsx)("animate",{attributeName:"stroke-width",begin:"1.5s",dur:"3s",values:"2;0",calcMode:"linear",repeatCount:"indefinite"})]}),(0,s.jsxs)("circle",{cx:"22",cy:"22",r:"6",strokeOpacity:"0",children:[(0,s.jsx)("animate",{attributeName:"r",begin:"3s",dur:"3s",values:"6;22",calcMode:"linear",repeatCount:"indefinite"}),(0,s.jsx)("animate",{attributeName:"stroke-opacity",begin:"3s",dur:"3s",values:"1;0",calcMode:"linear",repeatCount:"indefinite"}),(0,s.jsx)("animate",{attributeName:"stroke-width",begin:"3s",dur:"3s",values:"2;0",calcMode:"linear",repeatCount:"indefinite"})]}),(0,s.jsx)("circle",{cx:"22",cy:"22",r:"8",children:(0,s.jsx)("animate",{attributeName:"r",begin:"0s",dur:"1.5s",values:"6;1;2;3;4;5;6",calcMode:"linear",repeatCount:"indefinite"})})]})})}):null}var c=n(99670),u=n(30226);function d(e,t){if("*"===e)return a()({loading:l,loader:()=>n.e(1772).then(n.bind(n,51772)),modules:["@theme/NotFound"],webpack:()=>[51772],render(e,t){const n=e.default;return(0,s.jsx)(u.z,{value:{plugin:{name:"native",id:"default"}},children:(0,s.jsx)(n,{...t})})}});const r=o[`${e}-${t}`],d={},p=[],f=[],g=(0,c.Z)(r);return Object.entries(g).forEach((e=>{let[t,n]=e;const r=i[n];r&&(d[t]=r[0],p.push(r[1]),f.push(r[2]))})),a().Map({loading:l,loader:d,modules:p,webpack:()=>f,render(t,n){const a=JSON.parse(JSON.stringify(r));Object.entries(t).forEach((t=>{let[n,r]=t;const o=r.default;if(!o)throw new Error(`The page component at ${e} doesn't have a default export. This makes it impossible to render anything. 
Consider default-exporting a React component.`);"object"!=typeof o&&"function"!=typeof o||Object.keys(r).filter((e=>"default"!==e)).forEach((e=>{o[e]=r[e]}));let i=a;const s=n.split(".");s.slice(0,-1).forEach((e=>{i=i[e]})),i[s[s.length-1]]=o}));const o=a.__comp;delete a.__comp;const i=a.__context;return delete a.__context,(0,s.jsx)(u.z,{value:i,children:(0,s.jsx)(o,{...a,...n})})}})}const p=[{path:"/blog/",component:d("/blog/","f83"),exact:!0},{path:"/blog/2023/08/02/copr/",component:d("/blog/2023/08/02/copr/","69d"),exact:!0},{path:"/blog/2024/01/28/rust-opinion/",component:d("/blog/2024/01/28/rust-opinion/","98d"),exact:!0},{path:"/blog/2024/02/07/lts-distros/",component:d("/blog/2024/02/07/lts-distros/","7f0"),exact:!0},{path:"/blog/2024/06/19/devconf-2024/",component:d("/blog/2024/06/19/devconf-2024/","427"),exact:!0},{path:"/blog/aoc-2022/1st-week/",component:d("/blog/aoc-2022/1st-week/","df4"),exact:!0},{path:"/blog/aoc-2022/2nd-week/",component:d("/blog/aoc-2022/2nd-week/","783"),exact:!0},{path:"/blog/aoc-2022/3rd-week/",component:d("/blog/aoc-2022/3rd-week/","7c5"),exact:!0},{path:"/blog/aoc-2022/4th-week/",component:d("/blog/aoc-2022/4th-week/","1ac"),exact:!0},{path:"/blog/aoc-2022/intro/",component:d("/blog/aoc-2022/intro/","ada"),exact:!0},{path:"/blog/archive/",component:d("/blog/archive/","22d"),exact:!0},{path:"/blog/leetcode/sort-diagonally/",component:d("/blog/leetcode/sort-diagonally/","d97"),exact:!0},{path:"/blog/tags/",component:d("/blog/tags/","f23"),exact:!0},{path:"/blog/tags/\ud83c\udfed/",component:d("/blog/tags/\ud83c\udfed/","381"),exact:!0},{path:"/blog/tags/admin/",component:d("/blog/tags/admin/","d3a"),exact:!0},{path:"/blog/tags/advent-of-code-2022/",component:d("/blog/tags/advent-of-code-2022/","7bd"),exact:!0},{path:"/blog/tags/advent-of-code/",component:d("/blog/tags/advent-of-code/","313"),exact:!0},{path:"/blog/tags/conferences/",component:d("/blog/tags/conferences/","77a"),exact:!0},{path:"/blog/tags/copr/",component:d("/blog/tags/copr/","959"),exact:!0},{path:"/blog/tags/cpp/",component:d("/blog/tags/cpp/","770"),exact:!0},{path:"/blog/tags/cult/",component:d("/blog/tags/cult/","73d"),exact:!0},{path:"/blog/tags/devconf/",component:d("/blog/tags/devconf/","602"),exact:!0},{path:"/blog/tags/fedora/",component:d("/blog/tags/fedora/","c8d"),exact:!0},{path:"/blog/tags/hype/",component:d("/blog/tags/hype/","d35"),exact:!0},{path:"/blog/tags/iterators/",component:d("/blog/tags/iterators/","2eb"),exact:!0},{path:"/blog/tags/leetcode/",component:d("/blog/tags/leetcode/","e31"),exact:!0},{path:"/blog/tags/linux-distributions/",component:d("/blog/tags/linux-distributions/","2be"),exact:!0},{path:"/blog/tags/lts/",component:d("/blog/tags/lts/","fa3"),exact:!0},{path:"/blog/tags/memory-safety/",component:d("/blog/tags/memory-safety/","d15"),exact:!0},{path:"/blog/tags/paywall/",component:d("/blog/tags/paywall/","9e7"),exact:!0},{path:"/blog/tags/red-hat/",component:d("/blog/tags/red-hat/","9b4"),exact:!0},{path:"/blog/tags/rust/",component:d("/blog/tags/rust/","bfd"),exact:!0},{path:"/blog/tags/support/",component:d("/blog/tags/support/","5f8"),exact:!0},{path:"/blog/tags/vps/",component:d("/blog/tags/vps/","1b8"),exact:!0},{path:"/contributions/",component:d("/contributions/","541"),exact:!0},{path:"/search/",component:d("/search/","c7b"),exact:!0},{path:"/talks/",component:d("/talks/","819"),exact:!0},{path:"/algorithms/",component:d("/algorithms/","c61"),routes:[{path:"/algorithms/",component:d("/algorithms/","b39"),routes:[{path:"/algorithms/tags/",componen
t:d("/algorithms/tags/","bb8"),exact:!0},{path:"/algorithms/tags/a-star/",component:d("/algorithms/tags/a-star/","83e"),exact:!0},{path:"/algorithms/tags/applications/",component:d("/algorithms/tags/applications/","b32"),exact:!0},{path:"/algorithms/tags/astar/",component:d("/algorithms/tags/astar/","08b"),exact:!0},{path:"/algorithms/tags/backtracking/",component:d("/algorithms/tags/backtracking/","e2d"),exact:!0},{path:"/algorithms/tags/balanced-trees/",component:d("/algorithms/tags/balanced-trees/","591"),exact:!0},{path:"/algorithms/tags/bellman-ford/",component:d("/algorithms/tags/bellman-ford/","2bc"),exact:!0},{path:"/algorithms/tags/bfs/",component:d("/algorithms/tags/bfs/","334"),exact:!0},{path:"/algorithms/tags/bottom-up-dp/",component:d("/algorithms/tags/bottom-up-dp/","9e5"),exact:!0},{path:"/algorithms/tags/brute-force/",component:d("/algorithms/tags/brute-force/","99b"),exact:!0},{path:"/algorithms/tags/c/",component:d("/algorithms/tags/c/","cc5"),exact:!0},{path:"/algorithms/tags/cpp/",component:d("/algorithms/tags/cpp/","f5b"),exact:!0},{path:"/algorithms/tags/csharp/",component:d("/algorithms/tags/csharp/","7a9"),exact:!0},{path:"/algorithms/tags/dijkstra/",component:d("/algorithms/tags/dijkstra/","aa8"),exact:!0},{path:"/algorithms/tags/dynamic-array/",component:d("/algorithms/tags/dynamic-array/","00e"),exact:!0},{path:"/algorithms/tags/dynamic-programming/",component:d("/algorithms/tags/dynamic-programming/","f82"),exact:!0},{path:"/algorithms/tags/exponential/",component:d("/algorithms/tags/exponential/","a74"),exact:!0},{path:"/algorithms/tags/graphs/",component:d("/algorithms/tags/graphs/","d5b"),exact:!0},{path:"/algorithms/tags/greedy/",component:d("/algorithms/tags/greedy/","079"),exact:!0},{path:"/algorithms/tags/hash-tables/",component:d("/algorithms/tags/hash-tables/","ae4"),exact:!0},{path:"/algorithms/tags/iterative/",component:d("/algorithms/tags/iterative/","783"),exact:!0},{path:"/algorithms/tags/iterators/",component:d("/algorithms/tags/iterators/","1bc"),exact:!0},{path:"/algorithms/tags/java/",component:d("/algorithms/tags/java/","2b4"),exact:!0},{path:"/algorithms/tags/karel/",component:d("/algorithms/tags/karel/","79f"),exact:!0},{path:"/algorithms/tags/postconditions/",component:d("/algorithms/tags/postconditions/","a27"),exact:!0},{path:"/algorithms/tags/python/",component:d("/algorithms/tags/python/","eb2"),exact:!0},{path:"/algorithms/tags/recursion/",component:d("/algorithms/tags/recursion/","2b0"),exact:!0},{path:"/algorithms/tags/red-black-trees/",component:d("/algorithms/tags/red-black-trees/","9ca"),exact:!0},{path:"/algorithms/tags/solution/",component:d("/algorithms/tags/solution/","fa0"),exact:!0},{path:"/algorithms/tags/sorting/",component:d("/algorithms/tags/sorting/","7ca"),exact:!0},{path:"/algorithms/tags/testing/",component:d("/algorithms/tags/testing/","2af"),exact:!0},{path:"/algorithms/tags/time-complexity/",component:d("/algorithms/tags/time-complexity/","2d3"),exact:!0},{path:"/algorithms/tags/top-down-dp/",component:d("/algorithms/tags/top-down-dp/","779"),exact:!0},{path:"/algorithms/",component:d("/algorithms/","b7c"),routes:[{path:"/algorithms/",component:d("/algorithms/","9b0"),exact:!0,sidebar:"autogeneratedBar"},{path:"/algorithms/algorithms-correctness/postcondition-ambiguity/",component:d("/algorithms/algorithms-correctness/postcondition-ambiguity/","c18"),exact:!0,sidebar:"autogeneratedBar"},{path:"/algorithms/category/algorithms-and-correctness/",component:d("/algorithms/category/algorithms-and-correctness/","ea2"),ex
act:!0,sidebar:"autogeneratedBar"},{path:"/algorithms/category/asymptotic-notation-and-time-complexity/",component:d("/algorithms/category/asymptotic-notation-and-time-complexity/","fba"),exact:!0,sidebar:"autogeneratedBar"},{path:"/algorithms/category/graphs/",component:d("/algorithms/category/graphs/","a92"),exact:!0,sidebar:"autogeneratedBar"},{path:"/algorithms/category/hash-tables/",component:d("/algorithms/category/hash-tables/","ddd"),exact:!0,sidebar:"autogeneratedBar"},{path:"/algorithms/category/paths-in-graphs/",component:d("/algorithms/category/paths-in-graphs/","7c7"),exact:!0,sidebar:"autogeneratedBar"},{path:"/algorithms/category/recursion/",component:d("/algorithms/category/recursion/","61f"),exact:!0,sidebar:"autogeneratedBar"},{path:"/algorithms/category/red-black-trees/",component:d("/algorithms/category/red-black-trees/","0c0"),exact:!0,sidebar:"autogeneratedBar"},{path:"/algorithms/graphs/bfs-tree/",component:d("/algorithms/graphs/bfs-tree/","2fb"),exact:!0,sidebar:"autogeneratedBar"},{path:"/algorithms/graphs/iterative-and-iterators/",component:d("/algorithms/graphs/iterative-and-iterators/","bfd"),exact:!0,sidebar:"autogeneratedBar"},{path:"/algorithms/hash-tables/breaking/",component:d("/algorithms/hash-tables/breaking/","319"),exact:!0,sidebar:"autogeneratedBar"},{path:"/algorithms/hash-tables/breaking/mitigations/",component:d("/algorithms/hash-tables/breaking/mitigations/","4c2"),exact:!0,sidebar:"autogeneratedBar"},{path:"/algorithms/hash-tables/breaking/python/",component:d("/algorithms/hash-tables/breaking/python/","3d1"),exact:!0,sidebar:"autogeneratedBar"},{path:"/algorithms/paths/bf-to-astar/",component:d("/algorithms/paths/bf-to-astar/","050"),exact:!0,sidebar:"autogeneratedBar"},{path:"/algorithms/paths/bf-to-astar/astar/",component:d("/algorithms/paths/bf-to-astar/astar/","b4d"),exact:!0,sidebar:"autogeneratedBar"},{path:"/algorithms/paths/bf-to-astar/bf/",component:d("/algorithms/paths/bf-to-astar/bf/","e9c"),exact:!0,sidebar:"autogeneratedBar"},{path:"/algorithms/paths/bf-to-astar/dijkstra/",component:d("/algorithms/paths/bf-to-astar/dijkstra/","fe4"),exact:!0,sidebar:"autogeneratedBar"},{path:"/algorithms/rb-trees/applications/",component:d("/algorithms/rb-trees/applications/","46a"),exact:!0,sidebar:"autogeneratedBar"},{path:"/algorithms/rb-trees/rules/",component:d("/algorithms/rb-trees/rules/","21a"),exact:!0,sidebar:"autogeneratedBar"},{path:"/algorithms/recursion/karel/",component:d("/algorithms/recursion/karel/","4cf"),exact:!0,sidebar:"autogeneratedBar"},{path:"/algorithms/recursion/karel/solution/",component:d("/algorithms/recursion/karel/solution/","115"),exact:!0,sidebar:"autogeneratedBar"},{path:"/algorithms/recursion/pyramid-slide-down/",component:d("/algorithms/recursion/pyramid-slide-down/","236"),exact:!0,sidebar:"autogeneratedBar"},{path:"/algorithms/recursion/pyramid-slide-down/bottom-up-dp/",component:d("/algorithms/recursion/pyramid-slide-down/bottom-up-dp/","00d"),exact:!0,sidebar:"autogeneratedBar"},{path:"/algorithms/recursion/pyramid-slide-down/greedy/",component:d("/algorithms/recursion/pyramid-slide-down/greedy/","4bf"),exact:!0,sidebar:"autogeneratedBar"},{path:"/algorithms/recursion/pyramid-slide-down/naive/",component:d("/algorithms/recursion/pyramid-slide-down/naive/","c1b"),exact:!0,sidebar:"autogeneratedBar"},{path:"/algorithms/recursion/pyramid-slide-down/top-down-dp/",component:d("/algorithms/recursion/pyramid-slide-down/top-down-dp/","fe9"),exact:!0,sidebar:"autogeneratedBar"},{path:"/algorithms/time-complexity/extend/
",component:d("/algorithms/time-complexity/extend/","250"),exact:!0,sidebar:"autogeneratedBar"}]}]}]},{path:"/c/",component:d("/c/","dae"),routes:[{path:"/c/",component:d("/c/","fc8"),routes:[{path:"/c/",component:d("/c/","1c4"),routes:[{path:"/c/",component:d("/c/","a0f"),exact:!0,sidebar:"autogeneratedBar"},{path:"/c/bonuses/seminar-03/",component:d("/c/bonuses/seminar-03/","aaa"),exact:!0,sidebar:"autogeneratedBar"},{path:"/c/bonuses/seminar-04/",component:d("/c/bonuses/seminar-04/","ffe"),exact:!0,sidebar:"autogeneratedBar"},{path:"/c/bonuses/seminar-05-06/",component:d("/c/bonuses/seminar-05-06/","4cd"),exact:!0,sidebar:"autogeneratedBar"},{path:"/c/bonuses/seminar-08/",component:d("/c/bonuses/seminar-08/","09a"),exact:!0,sidebar:"autogeneratedBar"},{path:"/c/bonuses/seminar-10/",component:d("/c/bonuses/seminar-10/","b9e"),exact:!0,sidebar:"autogeneratedBar"},{path:"/c/category/bonuses/",component:d("/c/category/bonuses/","17e"),exact:!0,sidebar:"autogeneratedBar"},{path:"/c/category/practice-exams/",component:d("/c/category/practice-exams/","009"),exact:!0,sidebar:"autogeneratedBar"},{path:"/c/mr/",component:d("/c/mr/","4c5"),exact:!0,sidebar:"autogeneratedBar"},{path:"/c/pexam/cams/",component:d("/c/pexam/cams/","a10"),exact:!0,sidebar:"autogeneratedBar"},{path:"/c/pexam/garbage_collect/",component:d("/c/pexam/garbage_collect/","44e"),exact:!0,sidebar:"autogeneratedBar"}]}]}]},{path:"/cpp/",component:d("/cpp/","269"),routes:[{path:"/cpp/",component:d("/cpp/","187"),routes:[{path:"/cpp/",component:d("/cpp/","102"),routes:[{path:"/cpp/",component:d("/cpp/","fcd"),exact:!0,sidebar:"autogeneratedBar"},{path:"/cpp/category/exceptions-and-raii/",component:d("/cpp/category/exceptions-and-raii/","cfa"),exact:!0,sidebar:"autogeneratedBar"},{path:"/cpp/environment/",component:d("/cpp/environment/","e0b"),exact:!0,sidebar:"autogeneratedBar"},{path:"/cpp/exceptions-and-raii/placeholders/",component:d("/cpp/exceptions-and-raii/placeholders/","9b3"),exact:!0,sidebar:"autogeneratedBar"}]}]}]},{path:"/",component:d("/","dfb"),exact:!0},{path:"*",component:d("*")}]},98934:(e,t,n)=>{"use strict";n.d(t,{_:()=>o,t:()=>i});var r=n(67294),a=n(85893);const o=r.createContext(!1);function i(e){let{children:t}=e;const[n,i]=(0,r.useState)(!1);return(0,r.useEffect)((()=>{i(!0)}),[]),(0,a.jsx)(o.Provider,{value:n,children:t})}},97221:(e,t,n)=>{"use strict";var r=n(67294),a=n(20745),o=n(73727),i=n(70405),s=n(10412);const l=[n(32497),n(3310),n(18320),n(29268),n(7439)];var c=n(723),u=n(16550),d=n(18790),p=n(85893);function f(e){let{children:t}=e;return(0,p.jsx)(p.Fragment,{children:t})}var g=n(35742),h=n(52263),m=n(44996),b=n(86668),y=n(10833),v=n(94711),w=n(19727),k=n(43320),x=n(18780),S=n(90197);function _(){const{i18n:{currentLocale:e,defaultLocale:t,localeConfigs:n}}=(0,h.Z)(),r=(0,v.l)(),a=n[e].htmlLang,o=e=>e.replace("-","_");return(0,p.jsxs)(g.Z,{children:[Object.entries(n).map((e=>{let[t,{htmlLang:n}]=e;return(0,p.jsx)("link",{rel:"alternate",href:r.createUrl({locale:t,fullyQualified:!0}),hrefLang:n},t)})),(0,p.jsx)("link",{rel:"alternate",href:r.createUrl({locale:t,fullyQualified:!0}),hrefLang:"x-default"}),(0,p.jsx)("meta",{property:"og:locale",content:o(a)}),Object.values(n).filter((e=>a!==e.htmlLang)).map((e=>(0,p.jsx)("meta",{property:"og:locale:alternate",content:o(e.htmlLang)},`meta-og-${e.htmlLang}`)))]})}function E(e){let{permalink:t}=e;const{siteConfig:{url:n}}=(0,h.Z)(),r=function(){const{siteConfig:{url:e,baseUrl:t,trailingSlash:n}}=(0,h.Z)(),{pathname:r}=(0,u.TH)();return 
e+(0,x.applyTrailingSlash)((0,m.Z)(r),{trailingSlash:n,baseUrl:t})}(),a=t?`${n}${t}`:r;return(0,p.jsxs)(g.Z,{children:[(0,p.jsx)("meta",{property:"og:url",content:a}),(0,p.jsx)("link",{rel:"canonical",href:a})]})}function C(){const{i18n:{currentLocale:e}}=(0,h.Z)(),{metadata:t,image:n}=(0,b.L)();return(0,p.jsxs)(p.Fragment,{children:[(0,p.jsxs)(g.Z,{children:[(0,p.jsx)("meta",{name:"twitter:card",content:"summary_large_image"}),(0,p.jsx)("body",{className:w.h})]}),n&&(0,p.jsx)(y.d,{image:n}),(0,p.jsx)(E,{}),(0,p.jsx)(_,{}),(0,p.jsx)(S.Z,{tag:k.HX,locale:e}),(0,p.jsx)(g.Z,{children:t.map(((e,t)=>(0,p.jsx)("meta",{...e},t)))})]})}const T=new Map;function A(e){if(T.has(e.pathname))return{...e,pathname:T.get(e.pathname)};if((0,d.f)(c.Z,e.pathname).some((e=>{let{route:t}=e;return!0===t.exact})))return T.set(e.pathname,e.pathname),e;const t=e.pathname.trim().replace(/(?:\/index)?\.html$/,"")||"/";return T.set(e.pathname,t),{...e,pathname:t}}var j=n(98934),N=n(58940),L=n(20469);function P(e){for(var t=arguments.length,n=new Array(t>1?t-1:0),r=1;r{const r=t.default?.[e]??t[e];return r?.(...n)}));return()=>a.forEach((e=>e?.()))}const I=function(e){let{children:t,location:n,previousLocation:r}=e;return(0,L.Z)((()=>{r!==n&&(!function(e){let{location:t,previousLocation:n}=e;if(!n)return;const r=t.pathname===n.pathname,a=t.hash===n.hash,o=t.search===n.search;if(r&&a&&!o)return;const{hash:i}=t;if(i){const e=decodeURIComponent(i.substring(1)),t=document.getElementById(e);t?.scrollIntoView()}else window.scrollTo(0,0)}({location:n,previousLocation:r}),P("onRouteDidUpdate",{previousLocation:r,location:n}))}),[r,n]),t};function R(e){const t=Array.from(new Set([e,decodeURI(e)])).map((e=>(0,d.f)(c.Z,e))).flat();return Promise.all(t.map((e=>e.route.component.preload?.())))}class O extends r.Component{previousLocation;routeUpdateCleanupCb;constructor(e){super(e),this.previousLocation=null,this.routeUpdateCleanupCb=s.Z.canUseDOM?P("onRouteUpdate",{previousLocation:null,location:this.props.location}):()=>{},this.state={nextRouteHasLoaded:!0}}shouldComponentUpdate(e,t){if(e.location===this.props.location)return t.nextRouteHasLoaded;const n=e.location;return this.previousLocation=this.props.location,this.setState({nextRouteHasLoaded:!1}),this.routeUpdateCleanupCb=P("onRouteUpdate",{previousLocation:this.previousLocation,location:n}),R(n.pathname).then((()=>{this.routeUpdateCleanupCb(),this.setState({nextRouteHasLoaded:!0})})).catch((e=>{console.warn(e),window.location.reload()})),!1}render(){const{children:e,location:t}=this.props;return(0,p.jsx)(I,{previousLocation:this.previousLocation,location:t,children:(0,p.jsx)(u.AW,{location:t,render:()=>e})})}}const F=O,M="__docusaurus-base-url-issue-banner-container",D="__docusaurus-base-url-issue-banner",B="__docusaurus-base-url-issue-banner-suggestion-container";function z(e){return`\ndocument.addEventListener('DOMContentLoaded', function maybeInsertBanner() {\n var shouldInsert = typeof window['docusaurus'] === 'undefined';\n shouldInsert && insertBanner();\n});\n\nfunction insertBanner() {\n var bannerContainer = document.createElement('div');\n bannerContainer.id = '${M}';\n var bannerHtml = ${JSON.stringify(function(e){return`\n
    \n

    Your Docusaurus site did not load properly.

    \n

    A very common reason is a wrong site baseUrl configuration.

    \n

    Current configured baseUrl = ${e} ${"/"===e?" (default value)":""}

    \n

    We suggest trying baseUrl =

    \n
    \n`}(e)).replace(/{if("undefined"==typeof document)return void n();const r=document.createElement("link");r.setAttribute("rel","prefetch"),r.setAttribute("href",e),r.onload=()=>t(),r.onerror=()=>n();const a=document.getElementsByTagName("head")[0]??document.getElementsByName("script")[0]?.parentNode;a?.appendChild(r)}))}:function(e){return new Promise(((t,n)=>{const r=new XMLHttpRequest;r.open("GET",e,!0),r.withCredentials=!0,r.onload=()=>{200===r.status?t():n()},r.send(null)}))};var Y=n(99670);const Q=new Set,X=new Set,J=()=>navigator.connection?.effectiveType.includes("2g")||navigator.connection?.saveData,ee={prefetch(e){if(!(e=>!J()&&!X.has(e)&&!Q.has(e))(e))return!1;Q.add(e);const t=(0,d.f)(c.Z,e).flatMap((e=>{return t=e.route.path,Object.entries(q).filter((e=>{let[n]=e;return n.replace(/-[^-]+$/,"")===t})).flatMap((e=>{let[,t]=e;return Object.values((0,Y.Z)(t))}));var t}));return Promise.all(t.map((e=>{const t=n.gca(e);return t&&!t.includes("undefined")?K(t).catch((()=>{})):Promise.resolve()})))},preload:e=>!!(e=>!J()&&!X.has(e))(e)&&(X.add(e),R(e))},te=Object.freeze(ee),ne=Boolean(!0);if(s.Z.canUseDOM){window.docusaurus=te;const e=document.getElementById("__docusaurus"),t=(0,p.jsx)(i.B6,{children:(0,p.jsx)(o.VK,{children:(0,p.jsx)(G,{})})}),n=(e,t)=>{console.error("Docusaurus React Root onRecoverableError:",e,t)},s=()=>{if(ne)r.startTransition((()=>{a.hydrateRoot(e,t,{onRecoverableError:n})}));else{const o=a.createRoot(e,{onRecoverableError:n});r.startTransition((()=>{o.render(t)}))}};R(window.location.pathname).then(s)}},58940:(e,t,n)=>{"use strict";n.d(t,{_:()=>d,M:()=>p});var r=n(67294),a=n(36809);const o=JSON.parse('{"docusaurus-plugin-content-docs":{"cpp":{"path":"/cpp","versions":[{"name":"current","label":"Next","isLast":true,"path":"/cpp","mainDocId":"cpp-intro","docs":[{"id":"cpp-intro","path":"/cpp/","sidebar":"autogeneratedBar"},{"id":"environment","path":"/cpp/environment","sidebar":"autogeneratedBar"},{"id":"exceptions-and-raii/2023-11-24-placeholders","path":"/cpp/exceptions-and-raii/placeholders","sidebar":"autogeneratedBar"},{"id":"/category/exceptions-and-raii","path":"/cpp/category/exceptions-and-raii","sidebar":"autogeneratedBar"}],"draftIds":[],"sidebars":{"autogeneratedBar":{"link":{"path":"/cpp/","label":"cpp-intro"}}}}],"breadcrumbs":true},"algorithms":{"path":"/algorithms","versions":[{"name":"current","label":"Next","isLast":true,"path":"/algorithms","mainDocId":"algorithms-intro","docs":[{"id":"algorithms-correctness/postcondition-ambiguity","path":"/algorithms/algorithms-correctness/postcondition-ambiguity","sidebar":"autogeneratedBar"},{"id":"algorithms-intro","path":"/algorithms/","sidebar":"autogeneratedBar"},{"id":"graphs/bfs-tree","path":"/algorithms/graphs/bfs-tree","sidebar":"autogeneratedBar"},{"id":"graphs/iterative-and-iterators","path":"/algorithms/graphs/iterative-and-iterators","sidebar":"autogeneratedBar"},{"id":"hash-tables/2023-11-28-breaking/breaking","path":"/algorithms/hash-tables/breaking","sidebar":"autogeneratedBar"},{"id":"hash-tables/2023-11-28-breaking/mitigations","path":"/algorithms/hash-tables/breaking/mitigations","sidebar":"autogeneratedBar"},{"id":"hash-tables/2023-11-28-breaking/python","path":"/algorithms/hash-tables/breaking/python","sidebar":"autogeneratedBar"},{"id":"paths/2024-01-01-bf-to-astar/astar","path":"/algorithms/paths/bf-to-astar/astar","sidebar":"autogeneratedBar"},{"id":"paths/2024-01-01-bf-to-astar/bf","path":"/algorithms/paths/bf-to-astar/bf","sidebar":"autogeneratedBar"},{"id":"paths/2024-01-01-bf-to-ast
ar/dijkstra","path":"/algorithms/paths/bf-to-astar/dijkstra","sidebar":"autogeneratedBar"},{"id":"paths/2024-01-01-bf-to-astar/index","path":"/algorithms/paths/bf-to-astar","sidebar":"autogeneratedBar"},{"id":"rb-trees/applications","path":"/algorithms/rb-trees/applications","sidebar":"autogeneratedBar"},{"id":"rb-trees/rules","path":"/algorithms/rb-trees/rules","sidebar":"autogeneratedBar"},{"id":"recursion/2022-11-29-karel/karel","path":"/algorithms/recursion/karel","sidebar":"autogeneratedBar"},{"id":"recursion/2022-11-29-karel/solution","path":"/algorithms/recursion/karel/solution","sidebar":"autogeneratedBar"},{"id":"recursion/2023-08-17-pyramid-slide-down/bottom-up-dp","path":"/algorithms/recursion/pyramid-slide-down/bottom-up-dp","sidebar":"autogeneratedBar"},{"id":"recursion/2023-08-17-pyramid-slide-down/greedy","path":"/algorithms/recursion/pyramid-slide-down/greedy","sidebar":"autogeneratedBar"},{"id":"recursion/2023-08-17-pyramid-slide-down/naive","path":"/algorithms/recursion/pyramid-slide-down/naive","sidebar":"autogeneratedBar"},{"id":"recursion/2023-08-17-pyramid-slide-down/pyramid-slide-down","path":"/algorithms/recursion/pyramid-slide-down","sidebar":"autogeneratedBar"},{"id":"recursion/2023-08-17-pyramid-slide-down/top-down-dp","path":"/algorithms/recursion/pyramid-slide-down/top-down-dp","sidebar":"autogeneratedBar"},{"id":"time-complexity/extend","path":"/algorithms/time-complexity/extend","sidebar":"autogeneratedBar"},{"id":"/category/algorithms-and-correctness","path":"/algorithms/category/algorithms-and-correctness","sidebar":"autogeneratedBar"},{"id":"/category/asymptotic-notation-and-time-complexity","path":"/algorithms/category/asymptotic-notation-and-time-complexity","sidebar":"autogeneratedBar"},{"id":"/category/recursion","path":"/algorithms/category/recursion","sidebar":"autogeneratedBar"},{"id":"/category/red-black-trees","path":"/algorithms/category/red-black-trees","sidebar":"autogeneratedBar"},{"id":"/category/graphs","path":"/algorithms/category/graphs","sidebar":"autogeneratedBar"},{"id":"/category/paths-in-graphs","path":"/algorithms/category/paths-in-graphs","sidebar":"autogeneratedBar"},{"id":"/category/hash-tables","path":"/algorithms/category/hash-tables","sidebar":"autogeneratedBar"}],"draftIds":[],"sidebars":{"autogeneratedBar":{"link":{"path":"/algorithms/","label":"algorithms-intro"}}}}],"breadcrumbs":true},"c":{"path":"/c","versions":[{"name":"current","label":"Next","isLast":true,"path":"/c","mainDocId":"c-intro","docs":[{"id":"bonuses/seminar-03","path":"/c/bonuses/seminar-03","sidebar":"autogeneratedBar"},{"id":"bonuses/seminar-04","path":"/c/bonuses/seminar-04","sidebar":"autogeneratedBar"},{"id":"bonuses/seminar-05-06","path":"/c/bonuses/seminar-05-06","sidebar":"autogeneratedBar"},{"id":"bonuses/seminar-08","path":"/c/bonuses/seminar-08","sidebar":"autogeneratedBar"},{"id":"bonuses/seminar-10","path":"/c/bonuses/seminar-10","sidebar":"autogeneratedBar"},{"id":"c-intro","path":"/c/","sidebar":"autogeneratedBar"},{"id":"mr","path":"/c/mr","sidebar":"autogeneratedBar"},{"id":"pexam/b-garbage_collect","path":"/c/pexam/garbage_collect","sidebar":"autogeneratedBar"},{"id":"pexam/c-cams","path":"/c/pexam/cams","sidebar":"autogeneratedBar"},{"id":"/category/bonuses","path":"/c/category/bonuses","sidebar":"autogeneratedBar"},{"id":"/category/practice-exams","path":"/c/category/practice-exams","sidebar":"autogeneratedBar"}],"draftIds":[],"sidebars":{"autogeneratedBar":{"link":{"path":"/c/","label":"c-intro"}}}}],"breadcrumbs":true}}}'),i=JSON.parse(
'{"defaultLocale":"en","locales":["en"],"path":"i18n","currentLocale":"en","localeConfigs":{"en":{"label":"English","direction":"ltr","htmlLang":"en","calendar":"gregory","path":"en"}}}');var s=n(57529);const l=JSON.parse('{"docusaurusVersion":"3.1.1","siteVersion":"0.0.0","pluginVersions":{"docusaurus-plugin-content-pages":{"type":"package","name":"@docusaurus/plugin-content-pages","version":"3.1.1"},"docusaurus-plugin-sitemap":{"type":"package","name":"@docusaurus/plugin-sitemap","version":"3.1.1"},"docusaurus-theme-classic":{"type":"package","name":"@docusaurus/theme-classic","version":"3.1.1"},"docusaurus-theme-search-algolia":{"type":"package","name":"@docusaurus/theme-search-algolia","version":"3.1.1"},"docusaurus-plugin-content-docs":{"type":"package","name":"@docusaurus/plugin-content-docs","version":"3.1.1"},"docusaurus-plugin-content-blog":{"type":"package","name":"@docusaurus/plugin-content-blog","version":"3.1.1"},"docusaurus-plugin-sass":{"type":"package","name":"docusaurus-plugin-sass","version":"0.2.5"},"docusaurus-plugin-client-redirects":{"type":"package","name":"@docusaurus/plugin-client-redirects","version":"3.1.1"},"docusaurus-theme-mermaid":{"type":"package","name":"@docusaurus/theme-mermaid","version":"3.1.1"}}}');var c=n(85893);const u={siteConfig:a.default,siteMetadata:l,globalData:o,i18n:i,codeTranslations:s},d=r.createContext(u);function p(e){let{children:t}=e;return(0,c.jsx)(d.Provider,{value:u,children:t})}},44763:(e,t,n)=>{"use strict";n.d(t,{Z:()=>f});var r=n(67294),a=n(10412),o=n(35742),i=n(18780),s=n(80647),l=n(85893);function c(e){let{error:t,tryAgain:n}=e;return(0,l.jsxs)("div",{style:{display:"flex",flexDirection:"column",justifyContent:"center",alignItems:"flex-start",minHeight:"100vh",width:"100%",maxWidth:"80ch",fontSize:"20px",margin:"0 auto",padding:"1rem"},children:[(0,l.jsx)("h1",{style:{fontSize:"3rem"},children:"This page crashed"}),(0,l.jsx)("button",{type:"button",onClick:n,style:{margin:"1rem 0",fontSize:"2rem",cursor:"pointer",borderRadius:20,padding:"1rem"},children:"Try again"}),(0,l.jsx)(u,{error:t})]})}function u(e){let{error:t}=e;const n=(0,i.getErrorCausalChain)(t).map((e=>e.message)).join("\n\nCause:\n");return(0,l.jsx)("p",{style:{whiteSpace:"pre-wrap"},children:n})}function d(e){let{error:t,tryAgain:n}=e;return(0,l.jsxs)(f,{fallback:()=>(0,l.jsx)(c,{error:t,tryAgain:n}),children:[(0,l.jsx)(o.Z,{children:(0,l.jsx)("title",{children:"Page Error"})}),(0,l.jsx)(s.Z,{children:(0,l.jsx)(c,{error:t,tryAgain:n})})]})}const p=e=>(0,l.jsx)(d,{...e});class f extends r.Component{constructor(e){super(e),this.state={error:null}}componentDidCatch(e){a.Z.canUseDOM&&this.setState({error:e})}render(){const{children:e}=this.props,{error:t}=this.state;if(t){const e={error:t,tryAgain:()=>this.setState({error:null})};return(this.props.fallback??p)(e)}return e??null}}},10412:(e,t,n)=>{"use strict";n.d(t,{Z:()=>a});const r="undefined"!=typeof window&&"document"in window&&"createElement"in window.document,a={canUseDOM:r,canUseEventListeners:r&&("addEventListener"in window||"attachEvent"in window),canUseIntersectionObserver:r&&"IntersectionObserver"in window,canUseViewport:r&&"screen"in window}},35742:(e,t,n)=>{"use strict";n.d(t,{Z:()=>o});n(67294);var r=n(70405),a=n(85893);function o(e){return(0,a.jsx)(r.ql,{...e})}},33692:(e,t,n)=>{"use strict";n.d(t,{Z:()=>f});var r=n(67294),a=n(73727),o=n(18780),i=n(52263),s=n(13919),l=n(10412),c=n(28138),u=n(44996),d=n(85893);function 
p(e,t){let{isNavLink:n,to:p,href:f,activeClassName:g,isActive:h,"data-noBrokenLinkCheck":m,autoAddBaseUrl:b=!0,...y}=e;const{siteConfig:{trailingSlash:v,baseUrl:w}}=(0,i.Z)(),{withBaseUrl:k}=(0,u.C)(),x=(0,c.Z)(),S=(0,r.useRef)(null);(0,r.useImperativeHandle)(t,(()=>S.current));const _=p||f;const E=(0,s.Z)(_),C=_?.replace("pathname://","");let T=void 0!==C?(A=C,b&&(e=>e.startsWith("/"))(A)?k(A):A):void 0;var A;T&&E&&(T=(0,o.applyTrailingSlash)(T,{trailingSlash:v,baseUrl:w}));const j=(0,r.useRef)(!1),N=n?a.OL:a.rU,L=l.Z.canUseIntersectionObserver,P=(0,r.useRef)(),I=()=>{j.current||null==T||(window.docusaurus.preload(T),j.current=!0)};(0,r.useEffect)((()=>(!L&&E&&null!=T&&window.docusaurus.prefetch(T),()=>{L&&P.current&&P.current.disconnect()})),[P,T,L,E]);const R=T?.startsWith("#")??!1,O=!y.target||"_self"===y.target,F=!T||!E||!O||R;return m||!R&&F||x.collectLink(T),y.id&&x.collectAnchor(y.id),F?(0,d.jsx)("a",{ref:S,href:T,..._&&!E&&{target:"_blank",rel:"noopener noreferrer"},...y}):(0,d.jsx)(N,{...y,onMouseEnter:I,onTouchStart:I,innerRef:e=>{S.current=e,L&&e&&E&&(P.current=new window.IntersectionObserver((t=>{t.forEach((t=>{e===t.target&&(t.isIntersecting||t.intersectionRatio>0)&&(P.current.unobserve(e),P.current.disconnect(),null!=T&&window.docusaurus.prefetch(T))}))})),P.current.observe(e))},to:T,...n&&{isActive:h,activeClassName:g}})}const f=r.forwardRef(p)},95999:(e,t,n)=>{"use strict";n.d(t,{Z:()=>c,I:()=>l});var r=n(67294),a=n(85893);function o(e,t){const n=e.split(/(\{\w+\})/).map(((e,n)=>{if(n%2==1){const n=t?.[e.slice(1,-1)];if(void 0!==n)return n}return e}));return n.some((e=>(0,r.isValidElement)(e)))?n.map(((e,t)=>(0,r.isValidElement)(e)?r.cloneElement(e,{key:t}):e)).filter((e=>""!==e)):n.join("")}var i=n(57529);function s(e){let{id:t,message:n}=e;if(void 0===t&&void 0===n)throw new Error("Docusaurus translation declarations must have at least a translation id or a default translation message");return i[t??n]??n??t}function l(e,t){let{message:n,id:r}=e;return o(s({message:n,id:r}),t)}function c(e){let{children:t,id:n,values:r}=e;if(t&&"string"!=typeof t)throw console.warn("Illegal children",t),new Error("The Docusaurus component only accept simple string values");const i=s({message:t,id:n});return(0,a.jsx)(a.Fragment,{children:o(i,r)})}},29935:(e,t,n)=>{"use strict";n.d(t,{m:()=>r});const r="default"},13919:(e,t,n)=>{"use strict";function r(e){return/^(?:\w*:|\/\/)/.test(e)}function a(e){return void 0!==e&&!r(e)}n.d(t,{Z:()=>a,b:()=>r})},44996:(e,t,n)=>{"use strict";n.d(t,{C:()=>i,Z:()=>s});var r=n(67294),a=n(52263),o=n(13919);function i(){const{siteConfig:{baseUrl:e,url:t}}=(0,a.Z)(),n=(0,r.useCallback)(((n,r)=>function(e,t,n,r){let{forcePrependBaseUrl:a=!1,absolute:i=!1}=void 0===r?{}:r;if(!n||n.startsWith("#")||(0,o.b)(n))return n;if(a)return t+n.replace(/^\//,"");if(n===t.replace(/\/$/,""))return t;const s=n.startsWith(t)?n:t+n.replace(/^\//,"");return i?e+s:s}(t,e,n,r)),[t,e]);return{withBaseUrl:n}}function s(e,t){void 0===t&&(t={});const{withBaseUrl:n}=i();return n(e,t)}},28138:(e,t,n)=>{"use strict";n.d(t,{Z:()=>i});var r=n(67294);n(85893);const a=r.createContext({collectAnchor:()=>{},collectLink:()=>{}}),o=()=>(0,r.useContext)(a);function i(){return o()}},52263:(e,t,n)=>{"use strict";n.d(t,{Z:()=>o});var r=n(67294),a=n(58940);function o(){return(0,r.useContext)(a._)}},72389:(e,t,n)=>{"use strict";n.d(t,{Z:()=>o});var r=n(67294),a=n(98934);function o(){return(0,r.useContext)(a._)}},20469:(e,t,n)=>{"use strict";n.d(t,{Z:()=>a});var r=n(67294);const 
a=n(10412).Z.canUseDOM?r.useLayoutEffect:r.useEffect},99670:(e,t,n)=>{"use strict";n.d(t,{Z:()=>a});const r=e=>"object"==typeof e&&!!e&&Object.keys(e).length>0;function a(e){const t={};return function e(n,a){Object.entries(n).forEach((n=>{let[o,i]=n;const s=a?`${a}.${o}`:o;r(i)?e(i,s):t[s]=i}))}(e),t}},30226:(e,t,n)=>{"use strict";n.d(t,{_:()=>o,z:()=>i});var r=n(67294),a=n(85893);const o=r.createContext(null);function i(e){let{children:t,value:n}=e;const i=r.useContext(o),s=(0,r.useMemo)((()=>function(e){let{parent:t,value:n}=e;if(!t){if(!n)throw new Error("Unexpected: no Docusaurus route context found");if(!("plugin"in n))throw new Error("Unexpected: Docusaurus topmost route context has no `plugin` attribute");return n}const r={...t.data,...n?.data};return{plugin:t.plugin,data:r}}({parent:i,value:n})),[i,n]);return(0,a.jsx)(o.Provider,{value:s,children:t})}},80143:(e,t,n)=>{"use strict";n.d(t,{Iw:()=>b,gA:()=>f,WS:()=>g,_r:()=>d,Jo:()=>y,zh:()=>p,yW:()=>m,gB:()=>h});var r=n(16550),a=n(52263),o=n(29935);function i(e,t){void 0===t&&(t={});const n=function(){const{globalData:e}=(0,a.Z)();return e}()[e];if(!n&&t.failfast)throw new Error(`Docusaurus plugin global data not found for "${e}" plugin.`);return n}const s=e=>e.versions.find((e=>e.isLast));function l(e,t){const n=s(e);return[...e.versions.filter((e=>e!==n)),n].find((e=>!!(0,r.LX)(t,{path:e.path,exact:!1,strict:!1})))}function c(e,t){const n=l(e,t),a=n?.docs.find((e=>!!(0,r.LX)(t,{path:e.path,exact:!0,strict:!1})));return{activeVersion:n,activeDoc:a,alternateDocVersions:a?function(t){const n={};return e.versions.forEach((e=>{e.docs.forEach((r=>{r.id===t&&(n[e.name]=r)}))})),n}(a.id):{}}}const u={},d=()=>i("docusaurus-plugin-content-docs")??u,p=e=>function(e,t,n){void 0===t&&(t=o.m),void 0===n&&(n={});const r=i(e),a=r?.[t];if(!a&&n.failfast)throw new Error(`Docusaurus plugin global data not found for "${e}" plugin with id "${t}".`);return a}("docusaurus-plugin-content-docs",e,{failfast:!0});function f(e){void 0===e&&(e={});const t=d(),{pathname:n}=(0,r.TH)();return function(e,t,n){void 0===n&&(n={});const a=Object.entries(e).sort(((e,t)=>t[1].path.localeCompare(e[1].path))).find((e=>{let[,n]=e;return!!(0,r.LX)(t,{path:n.path,exact:!1,strict:!1})})),o=a?{pluginId:a[0],pluginData:a[1]}:void 0;if(!o&&n.failfast)throw new Error(`Can't find active docs plugin for "${t}" pathname, while it was expected to be found. Maybe you tried to use a docs feature that can only be used on a docs-related page? 
Existing docs plugin paths are: ${Object.values(e).map((e=>e.path)).join(", ")}`);return o}(t,n,e)}function g(e){void 0===e&&(e={});const t=f(e),{pathname:n}=(0,r.TH)();if(!t)return;return{activePlugin:t,activeVersion:l(t.pluginData,n)}}function h(e){return p(e).versions}function m(e){const t=p(e);return s(t)}function b(e){const t=p(e),{pathname:n}=(0,r.TH)();return c(t,n)}function y(e){const t=p(e),{pathname:n}=(0,r.TH)();return function(e,t){const n=s(e);return{latestDocSuggestion:c(e,t).alternateDocVersions[n.name],latestVersionSuggestion:n}}(t,n)}},18320:(e,t,n)=>{"use strict";n.r(t),n.d(t,{default:()=>o});var r=n(74865),a=n.n(r);a().configure({showSpinner:!1});const o={onRouteUpdate(e){let{location:t,previousLocation:n}=e;if(n&&t.pathname!==n.pathname){const e=window.setTimeout((()=>{a().start()}),200);return()=>window.clearTimeout(e)}},onRouteDidUpdate(){a().done()}}},3310:(e,t,n)=>{"use strict";n.r(t);var r=n(14965),a=n(36809);!function(e){const{themeConfig:{prism:t}}=a.default,{additionalLanguages:r}=t;globalThis.Prism=e,r.forEach((e=>{"php"===e&&n(96854),n(30218)(`./prism-${e}`)})),delete globalThis.Prism}(r.p1)},92503:(e,t,n)=>{"use strict";n.d(t,{Z:()=>u});n(67294);var r=n(36905),a=n(95999),o=n(86668),i=n(33692),s=n(28138);const l={anchorWithStickyNavbar:"anchorWithStickyNavbar_LWe7",anchorWithHideOnScrollNavbar:"anchorWithHideOnScrollNavbar_WYt5"};var c=n(85893);function u(e){let{as:t,id:n,...u}=e;const d=(0,s.Z)(),{navbar:{hideOnScroll:p}}=(0,o.L)();if("h1"===t||!n)return(0,c.jsx)(t,{...u,id:void 0});d.collectAnchor(n);const f=(0,a.I)({id:"theme.common.headingLinkTitle",message:"Direct link to {heading}",description:"Title for link to heading"},{heading:"string"==typeof u.children?u.children:n});return(0,c.jsxs)(t,{...u,className:(0,r.Z)("anchor",p?l.anchorWithHideOnScrollNavbar:l.anchorWithStickyNavbar,u.className),id:n,children:[u.children,(0,c.jsx)(i.Z,{className:"hash-link",to:`#${n}`,"aria-label":f,title:f,children:"\u200b"})]})}},39471:(e,t,n)=>{"use strict";n.d(t,{Z:()=>o});n(67294);const r={iconExternalLink:"iconExternalLink_nPIU"};var a=n(85893);function o(e){let{width:t=13.5,height:n=13.5}=e;return(0,a.jsx)("svg",{width:t,height:n,"aria-hidden":"true",viewBox:"0 0 24 24",className:r.iconExternalLink,children:(0,a.jsx)("path",{fill:"currentColor",d:"M21 13v10h-21v-19h12v2h-10v15h17v-8h2zm3-12h-10.988l4.035 4-6.977 7.07 2.828 2.828 6.977-7.07 4.125 4.172v-11z"})})}},80647:(e,t,n)=>{"use strict";n.d(t,{Z:()=>At});var r=n(67294),a=n(36905),o=n(44763),i=n(10833),s=n(16550),l=n(95999),c=n(85936),u=n(85893);const d="__docusaurus_skipToContent_fallback";function p(e){e.setAttribute("tabindex","-1"),e.focus(),e.removeAttribute("tabindex")}function f(){const e=(0,r.useRef)(null),{action:t}=(0,s.k6)(),n=(0,r.useCallback)((e=>{e.preventDefault();const t=document.querySelector("main:first-of-type")??document.getElementById(d);t&&p(t)}),[]);return(0,c.S)((n=>{let{location:r}=n;e.current&&!r.hash&&"PUSH"===t&&p(e.current)})),{containerRef:e,onClick:n}}const g=(0,l.I)({id:"theme.common.skipToMainContent",description:"The skip to content label used for accessibility, allowing to rapidly navigate to main content with keyboard tab/enter navigation",message:"Skip to main content"});function h(e){const t=e.children??g,{containerRef:n,onClick:r}=f();return(0,u.jsx)("div",{ref:n,role:"region","aria-label":g,children:(0,u.jsx)("a",{...e,href:`#${d}`,onClick:r,children:t})})}var m=n(35281),b=n(19727);const y={skipToContent:"skipToContent_fXgn"};function 
v(){return(0,u.jsx)(h,{className:y.skipToContent})}var w=n(86668),k=n(59689);function x(e){let{width:t=21,height:n=21,color:r="currentColor",strokeWidth:a=1.2,className:o,...i}=e;return(0,u.jsx)("svg",{viewBox:"0 0 15 15",width:t,height:n,...i,children:(0,u.jsx)("g",{stroke:r,strokeWidth:a,children:(0,u.jsx)("path",{d:"M.75.75l13.5 13.5M14.25.75L.75 14.25"})})})}const S={closeButton:"closeButton_CVFx"};function _(e){return(0,u.jsx)("button",{type:"button","aria-label":(0,l.I)({id:"theme.AnnouncementBar.closeButtonAriaLabel",message:"Close",description:"The ARIA label for close button of announcement bar"}),...e,className:(0,a.Z)("clean-btn close",S.closeButton,e.className),children:(0,u.jsx)(x,{width:14,height:14,strokeWidth:3.1})})}const E={content:"content_knG7"};function C(e){const{announcementBar:t}=(0,w.L)(),{content:n}=t;return(0,u.jsx)("div",{...e,className:(0,a.Z)(E.content,e.className),dangerouslySetInnerHTML:{__html:n}})}const T={announcementBar:"announcementBar_mb4j",announcementBarPlaceholder:"announcementBarPlaceholder_vyr4",announcementBarClose:"announcementBarClose_gvF7",announcementBarContent:"announcementBarContent_xLdY"};function A(){const{announcementBar:e}=(0,w.L)(),{isActive:t,close:n}=(0,k.nT)();if(!t)return null;const{backgroundColor:r,textColor:a,isCloseable:o}=e;return(0,u.jsxs)("div",{className:T.announcementBar,style:{backgroundColor:r,color:a},role:"banner",children:[o&&(0,u.jsx)("div",{className:T.announcementBarPlaceholder}),(0,u.jsx)(C,{className:T.announcementBarContent}),o&&(0,u.jsx)(_,{onClick:n,className:T.announcementBarClose})]})}var j=n(93163),N=n(12466);var L=n(902),P=n(13102);const I=r.createContext(null);function R(e){let{children:t}=e;const n=function(){const e=(0,j.e)(),t=(0,P.HY)(),[n,a]=(0,r.useState)(!1),o=null!==t.component,i=(0,L.D9)(o);return(0,r.useEffect)((()=>{o&&!i&&a(!0)}),[o,i]),(0,r.useEffect)((()=>{o?e.shown||a(!0):a(!1)}),[e.shown,o]),(0,r.useMemo)((()=>[n,a]),[n])}();return(0,u.jsx)(I.Provider,{value:n,children:t})}function O(e){if(e.component){const t=e.component;return(0,u.jsx)(t,{...e.props})}}function F(){const e=(0,r.useContext)(I);if(!e)throw new L.i6("NavbarSecondaryMenuDisplayProvider");const[t,n]=e,a=(0,r.useCallback)((()=>n(!1)),[n]),o=(0,P.HY)();return(0,r.useMemo)((()=>({shown:t,hide:a,content:O(o)})),[a,o,t])}function M(e){let{header:t,primaryMenu:n,secondaryMenu:r}=e;const{shown:o}=F();return(0,u.jsxs)("div",{className:"navbar-sidebar",children:[t,(0,u.jsxs)("div",{className:(0,a.Z)("navbar-sidebar__items",{"navbar-sidebar__items--show-secondary":o}),children:[(0,u.jsx)("div",{className:"navbar-sidebar__item menu",children:n}),(0,u.jsx)("div",{className:"navbar-sidebar__item menu",children:r})]})]})}var D=n(92949),B=n(72389);function z(e){return(0,u.jsx)("svg",{viewBox:"0 0 24 24",width:24,height:24,...e,children:(0,u.jsx)("path",{fill:"currentColor",d:"M12,9c1.65,0,3,1.35,3,3s-1.35,3-3,3s-3-1.35-3-3S10.35,9,12,9 M12,7c-2.76,0-5,2.24-5,5s2.24,5,5,5s5-2.24,5-5 S14.76,7,12,7L12,7z M2,13l2,0c0.55,0,1-0.45,1-1s-0.45-1-1-1l-2,0c-0.55,0-1,0.45-1,1S1.45,13,2,13z M20,13l2,0c0.55,0,1-0.45,1-1 s-0.45-1-1-1l-2,0c-0.55,0-1,0.45-1,1S19.45,13,20,13z M11,2v2c0,0.55,0.45,1,1,1s1-0.45,1-1V2c0-0.55-0.45-1-1-1S11,1.45,11,2z M11,20v2c0,0.55,0.45,1,1,1s1-0.45,1-1v-2c0-0.55-0.45-1-1-1C11.45,19,11,19.45,11,20z M5.99,4.58c-0.39-0.39-1.03-0.39-1.41,0 c-0.39,0.39-0.39,1.03,0,1.41l1.06,1.06c0.39,0.39,1.03,0.39,1.41,0s0.39-1.03,0-1.41L5.99,4.58z M18.36,16.95 
c-0.39-0.39-1.03-0.39-1.41,0c-0.39,0.39-0.39,1.03,0,1.41l1.06,1.06c0.39,0.39,1.03,0.39,1.41,0c0.39-0.39,0.39-1.03,0-1.41 L18.36,16.95z M19.42,5.99c0.39-0.39,0.39-1.03,0-1.41c-0.39-0.39-1.03-0.39-1.41,0l-1.06,1.06c-0.39,0.39-0.39,1.03,0,1.41 s1.03,0.39,1.41,0L19.42,5.99z M7.05,18.36c0.39-0.39,0.39-1.03,0-1.41c-0.39-0.39-1.03-0.39-1.41,0l-1.06,1.06 c-0.39,0.39-0.39,1.03,0,1.41s1.03,0.39,1.41,0L7.05,18.36z"})})}function $(e){return(0,u.jsx)("svg",{viewBox:"0 0 24 24",width:24,height:24,...e,children:(0,u.jsx)("path",{fill:"currentColor",d:"M9.37,5.51C9.19,6.15,9.1,6.82,9.1,7.5c0,4.08,3.32,7.4,7.4,7.4c0.68,0,1.35-0.09,1.99-0.27C17.45,17.19,14.93,19,12,19 c-3.86,0-7-3.14-7-7C5,9.07,6.81,6.55,9.37,5.51z M12,3c-4.97,0-9,4.03-9,9s4.03,9,9,9s9-4.03,9-9c0-0.46-0.04-0.92-0.1-1.36 c-0.98,1.37-2.58,2.26-4.4,2.26c-2.98,0-5.4-2.42-5.4-5.4c0-1.81,0.89-3.42,2.26-4.4C12.92,3.04,12.46,3,12,3L12,3z"})})}const U={toggle:"toggle_vylO",toggleButton:"toggleButton_gllP",darkToggleIcon:"darkToggleIcon_wfgR",lightToggleIcon:"lightToggleIcon_pyhR",toggleButtonDisabled:"toggleButtonDisabled_aARS"};function Z(e){let{className:t,buttonClassName:n,value:r,onChange:o}=e;const i=(0,B.Z)(),s=(0,l.I)({message:"Switch between dark and light mode (currently {mode})",id:"theme.colorToggle.ariaLabel",description:"The ARIA label for the navbar color mode toggle"},{mode:"dark"===r?(0,l.I)({message:"dark mode",id:"theme.colorToggle.ariaLabel.mode.dark",description:"The name for the dark color mode"}):(0,l.I)({message:"light mode",id:"theme.colorToggle.ariaLabel.mode.light",description:"The name for the light color mode"})});return(0,u.jsx)("div",{className:(0,a.Z)(U.toggle,t),children:(0,u.jsxs)("button",{className:(0,a.Z)("clean-btn",U.toggleButton,!i&&U.toggleButtonDisabled,n),type:"button",onClick:()=>o("dark"===r?"light":"dark"),disabled:!i,title:s,"aria-label":s,"aria-live":"polite",children:[(0,u.jsx)(z,{className:(0,a.Z)(U.toggleIcon,U.lightToggleIcon)}),(0,u.jsx)($,{className:(0,a.Z)(U.toggleIcon,U.darkToggleIcon)})]})})}const H=r.memo(Z),V={darkNavbarColorModeToggle:"darkNavbarColorModeToggle_X3D1"};function W(e){let{className:t}=e;const n=(0,w.L)().navbar.style,r=(0,w.L)().colorMode.disableSwitch,{colorMode:a,setColorMode:o}=(0,D.I)();return r?null:(0,u.jsx)(H,{className:t,buttonClassName:"dark"===n?V.darkNavbarColorModeToggle:void 0,value:a,onChange:o})}var G=n(21327);function q(){return(0,u.jsx)(G.Z,{className:"navbar__brand",imageClassName:"navbar__logo",titleClassName:"navbar__title text--truncate"})}function K(){const e=(0,j.e)();return(0,u.jsx)("button",{type:"button","aria-label":(0,l.I)({id:"theme.docs.sidebar.closeSidebarButtonAriaLabel",message:"Close navigation bar",description:"The ARIA label for close button of mobile sidebar"}),className:"clean-btn navbar-sidebar__close",onClick:()=>e.toggle(),children:(0,u.jsx)(x,{color:"var(--ifm-color-emphasis-600)"})})}function Y(){return(0,u.jsxs)("div",{className:"navbar-sidebar__brand",children:[(0,u.jsx)(q,{}),(0,u.jsx)(W,{className:"margin-right--md"}),(0,u.jsx)(K,{})]})}var Q=n(33692),X=n(44996),J=n(13919),ee=n(98022),te=n(39471);function ne(e){let{activeBasePath:t,activeBaseRegex:n,to:r,href:a,label:o,html:i,isDropdownLink:s,prependBaseUrlToHref:l,...c}=e;const d=(0,X.Z)(r),p=(0,X.Z)(t),f=(0,X.Z)(a,{forcePrependBaseUrl:!0}),g=o&&a&&!(0,J.Z)(a),h=i?{dangerouslySetInnerHTML:{__html:i}}:{children:(0,u.jsxs)(u.Fragment,{children:[o,g&&(0,u.jsx)(te.Z,{...s&&{width:12,height:12}})]})};return 
a?(0,u.jsx)(Q.Z,{href:l?f:a,...c,...h}):(0,u.jsx)(Q.Z,{to:d,isNavLink:!0,...(t||n)&&{isActive:(e,t)=>n?(0,ee.F)(n,t.pathname):t.pathname.startsWith(p)},...c,...h})}function re(e){let{className:t,isDropdownItem:n=!1,...r}=e;const o=(0,u.jsx)(ne,{className:(0,a.Z)(n?"dropdown__link":"navbar__item navbar__link",t),isDropdownLink:n,...r});return n?(0,u.jsx)("li",{children:o}):o}function ae(e){let{className:t,isDropdownItem:n,...r}=e;return(0,u.jsx)("li",{className:"menu__list-item",children:(0,u.jsx)(ne,{className:(0,a.Z)("menu__link",t),...r})})}function oe(e){let{mobile:t=!1,position:n,...r}=e;const a=t?ae:re;return(0,u.jsx)(a,{...r,activeClassName:r.activeClassName??(t?"menu__link--active":"navbar__link--active")})}var ie=n(86043),se=n(48596),le=n(52263);const ce={dropdownNavbarItemMobile:"dropdownNavbarItemMobile_S0Fm"};function ue(e,t){return e.some((e=>function(e,t){return!!(0,se.Mg)(e.to,t)||!!(0,ee.F)(e.activeBaseRegex,t)||!(!e.activeBasePath||!t.startsWith(e.activeBasePath))}(e,t)))}function de(e){let{items:t,position:n,className:o,onClick:i,...s}=e;const l=(0,r.useRef)(null),[c,d]=(0,r.useState)(!1);return(0,r.useEffect)((()=>{const e=e=>{l.current&&!l.current.contains(e.target)&&d(!1)};return document.addEventListener("mousedown",e),document.addEventListener("touchstart",e),document.addEventListener("focusin",e),()=>{document.removeEventListener("mousedown",e),document.removeEventListener("touchstart",e),document.removeEventListener("focusin",e)}}),[l]),(0,u.jsxs)("div",{ref:l,className:(0,a.Z)("navbar__item","dropdown","dropdown--hoverable",{"dropdown--right":"right"===n,"dropdown--show":c}),children:[(0,u.jsx)(ne,{"aria-haspopup":"true","aria-expanded":c,role:"button",href:s.to?void 0:"#",className:(0,a.Z)("navbar__link",o),...s,onClick:s.to?void 0:e=>e.preventDefault(),onKeyDown:e=>{"Enter"===e.key&&(e.preventDefault(),d(!c))},children:s.children??s.label}),(0,u.jsx)("ul",{className:"dropdown__menu",children:t.map(((e,t)=>(0,r.createElement)(He,{isDropdownItem:!0,activeClassName:"dropdown__link--active",...e,key:t})))})]})}function pe(e){let{items:t,className:n,position:o,onClick:i,...l}=e;const c=function(){const{siteConfig:{baseUrl:e}}=(0,le.Z)(),{pathname:t}=(0,s.TH)();return t.replace(e,"/")}(),d=ue(t,c),{collapsed:p,toggleCollapsed:f,setCollapsed:g}=(0,ie.u)({initialState:()=>!d});return(0,r.useEffect)((()=>{d&&g(!d)}),[c,d,g]),(0,u.jsxs)("li",{className:(0,a.Z)("menu__list-item",{"menu__list-item--collapsed":p}),children:[(0,u.jsx)(ne,{role:"button",className:(0,a.Z)(ce.dropdownNavbarItemMobile,"menu__link menu__link--sublist menu__link--sublist-caret",n),...l,onClick:e=>{e.preventDefault(),f()},children:l.children??l.label}),(0,u.jsx)(ie.z,{lazy:!0,as:"ul",className:"menu__list",collapsed:p,children:t.map(((e,t)=>(0,r.createElement)(He,{mobile:!0,isDropdownItem:!0,onClick:i,activeClassName:"menu__link--active",...e,key:t})))})]})}function fe(e){let{mobile:t=!1,...n}=e;const r=t?pe:de;return(0,u.jsx)(r,{...n})}var ge=n(94711);function he(e){let{width:t=20,height:n=20,...r}=e;return(0,u.jsx)("svg",{viewBox:"0 0 24 24",width:t,height:n,"aria-hidden":!0,...r,children:(0,u.jsx)("path",{fill:"currentColor",d:"M12.87 15.07l-2.54-2.51.03-.03c1.74-1.94 2.98-4.17 3.71-6.53H17V4h-7V2H8v2H1v1.99h11.17C11.5 7.92 10.44 9.75 9 11.35 8.07 10.32 7.3 9.19 6.69 8h-2c.73 1.63 1.73 3.17 2.98 4.56l-5.09 5.02L4 19l5-5 3.11 3.11.76-2.04zM18.5 10h-2L12 22h2l1.12-3h4.75L21 22h2l-4.5-12zm-2.62 7l1.62-4.33L19.12 17h-3.24z"})})}const me="iconLanguage_nlXk";var be=n(73935);function ye(){return 
r.createElement("svg",{width:"15",height:"15",className:"DocSearch-Control-Key-Icon"},r.createElement("path",{d:"M4.505 4.496h2M5.505 5.496v5M8.216 4.496l.055 5.993M10 7.5c.333.333.5.667.5 1v2M12.326 4.5v5.996M8.384 4.496c1.674 0 2.116 0 2.116 1.5s-.442 1.5-2.116 1.5M3.205 9.303c-.09.448-.277 1.21-1.241 1.203C1 10.5.5 9.513.5 8V7c0-1.57.5-2.5 1.464-2.494.964.006 1.134.598 1.24 1.342M12.553 10.5h1.953",strokeWidth:"1.2",stroke:"currentColor",fill:"none",strokeLinecap:"square"}))}var ve=n(20830),we=["translations"];function ke(){return ke=Object.assign||function(e){for(var t=1;te.length)&&(t=e.length);for(var n=0,r=new Array(t);n=0||(a[n]=e[n]);return a}(e,t);if(Object.getOwnPropertySymbols){var o=Object.getOwnPropertySymbols(e);for(r=0;r=0||Object.prototype.propertyIsEnumerable.call(e,n)&&(a[n]=e[n])}return a}var Ee="Ctrl";var Ce=r.forwardRef((function(e,t){var n=e.translations,a=void 0===n?{}:n,o=_e(e,we),i=a.buttonText,s=void 0===i?"Search":i,l=a.buttonAriaLabel,c=void 0===l?"Search":l,u=xe((0,r.useState)(null),2),d=u[0],p=u[1];return(0,r.useEffect)((function(){"undefined"!=typeof navigator&&(/(Mac|iPhone|iPod|iPad)/i.test(navigator.platform)?p("\u2318"):p(Ee))}),[]),r.createElement("button",ke({type:"button",className:"DocSearch DocSearch-Button","aria-label":c},o,{ref:t}),r.createElement("span",{className:"DocSearch-Button-Container"},r.createElement(ve.W,null),r.createElement("span",{className:"DocSearch-Button-Placeholder"},s)),r.createElement("span",{className:"DocSearch-Button-Keys"},null!==d&&r.createElement(r.Fragment,null,r.createElement("kbd",{className:"DocSearch-Button-Key"},d===Ee?r.createElement(ye,null):d),r.createElement("kbd",{className:"DocSearch-Button-Key"},"K"))))})),Te=n(35742),Ae=n(66177),je=n(239),Ne=n(43320);const Le={button:{buttonText:(0,l.I)({id:"theme.SearchBar.label",message:"Search",description:"The ARIA label and placeholder for search button"}),buttonAriaLabel:(0,l.I)({id:"theme.SearchBar.label",message:"Search",description:"The ARIA label and placeholder for search button"})},modal:{searchBox:{resetButtonTitle:(0,l.I)({id:"theme.SearchModal.searchBox.resetButtonTitle",message:"Clear the query",description:"The label and ARIA label for search box reset button"}),resetButtonAriaLabel:(0,l.I)({id:"theme.SearchModal.searchBox.resetButtonTitle",message:"Clear the query",description:"The label and ARIA label for search box reset button"}),cancelButtonText:(0,l.I)({id:"theme.SearchModal.searchBox.cancelButtonText",message:"Cancel",description:"The label and ARIA label for search box cancel button"}),cancelButtonAriaLabel:(0,l.I)({id:"theme.SearchModal.searchBox.cancelButtonText",message:"Cancel",description:"The label and ARIA label for search box cancel button"})},startScreen:{recentSearchesTitle:(0,l.I)({id:"theme.SearchModal.startScreen.recentSearchesTitle",message:"Recent",description:"The title for recent searches"}),noRecentSearchesText:(0,l.I)({id:"theme.SearchModal.startScreen.noRecentSearchesText",message:"No recent searches",description:"The text when no recent searches"}),saveRecentSearchButtonTitle:(0,l.I)({id:"theme.SearchModal.startScreen.saveRecentSearchButtonTitle",message:"Save this search",description:"The label for save recent search button"}),removeRecentSearchButtonTitle:(0,l.I)({id:"theme.SearchModal.startScreen.removeRecentSearchButtonTitle",message:"Remove this search from history",description:"The label for remove recent search 
button"}),favoriteSearchesTitle:(0,l.I)({id:"theme.SearchModal.startScreen.favoriteSearchesTitle",message:"Favorite",description:"The title for favorite searches"}),removeFavoriteSearchButtonTitle:(0,l.I)({id:"theme.SearchModal.startScreen.removeFavoriteSearchButtonTitle",message:"Remove this search from favorites",description:"The label for remove favorite search button"})},errorScreen:{titleText:(0,l.I)({id:"theme.SearchModal.errorScreen.titleText",message:"Unable to fetch results",description:"The title for error screen of search modal"}),helpText:(0,l.I)({id:"theme.SearchModal.errorScreen.helpText",message:"You might want to check your network connection.",description:"The help text for error screen of search modal"})},footer:{selectText:(0,l.I)({id:"theme.SearchModal.footer.selectText",message:"to select",description:"The explanatory text of the action for the enter key"}),selectKeyAriaLabel:(0,l.I)({id:"theme.SearchModal.footer.selectKeyAriaLabel",message:"Enter key",description:"The ARIA label for the Enter key button that makes the selection"}),navigateText:(0,l.I)({id:"theme.SearchModal.footer.navigateText",message:"to navigate",description:"The explanatory text of the action for the Arrow up and Arrow down key"}),navigateUpKeyAriaLabel:(0,l.I)({id:"theme.SearchModal.footer.navigateUpKeyAriaLabel",message:"Arrow up",description:"The ARIA label for the Arrow up key button that makes the navigation"}),navigateDownKeyAriaLabel:(0,l.I)({id:"theme.SearchModal.footer.navigateDownKeyAriaLabel",message:"Arrow down",description:"The ARIA label for the Arrow down key button that makes the navigation"}),closeText:(0,l.I)({id:"theme.SearchModal.footer.closeText",message:"to close",description:"The explanatory text of the action for Escape key"}),closeKeyAriaLabel:(0,l.I)({id:"theme.SearchModal.footer.closeKeyAriaLabel",message:"Escape key",description:"The ARIA label for the Escape key button that close the modal"}),searchByText:(0,l.I)({id:"theme.SearchModal.footer.searchByText",message:"Search by",description:"The text explain that the search is making by Algolia"})},noResultsScreen:{noResultsText:(0,l.I)({id:"theme.SearchModal.noResultsScreen.noResultsText",message:"No results for",description:"The text explains that there are no results for the following search"}),suggestedQueryText:(0,l.I)({id:"theme.SearchModal.noResultsScreen.suggestedQueryText",message:"Try searching for",description:"The text for the suggested query when no results are found for the following search"}),reportMissingResultsText:(0,l.I)({id:"theme.SearchModal.noResultsScreen.reportMissingResultsText",message:"Believe this query should return results?",description:"The text for the question where the user thinks there are missing results"}),reportMissingResultsLinkText:(0,l.I)({id:"theme.SearchModal.noResultsScreen.reportMissingResultsLinkText",message:"Let us know.",description:"The text for the link to report missing results"})}},placeholder:(0,l.I)({id:"theme.SearchModal.placeholder",message:"Search docs",description:"The placeholder of the input of the DocSearch pop-up modal"})};let Pe=null;function Ie(e){let{hit:t,children:n}=e;return(0,u.jsx)(Q.Z,{to:t.url,children:n})}function Re(e){let{state:t,onClose:n}=e;const r=(0,Ae.M)();return(0,u.jsx)(Q.Z,{to:r(t.query),onClick:n,children:(0,u.jsx)(l.Z,{id:"theme.SearchBar.seeAll",values:{count:t.context.nbHits},children:"See all {count} results"})})}function 
Oe(e){let{contextualSearch:t,externalUrlRegex:a,...o}=e;const{siteMetadata:i}=(0,le.Z)(),l=(0,je.l)(),c=function(){const{locale:e,tags:t}=(0,Ne._q)();return[`language:${e}`,t.map((e=>`docusaurus_tag:${e}`))]}(),d=o.searchParameters?.facetFilters??[],p=t?function(e,t){const n=e=>"string"==typeof e?[e]:e;return[...n(e),...n(t)]}(c,d):d,f={...o.searchParameters,facetFilters:p},g=(0,s.k6)(),h=(0,r.useRef)(null),m=(0,r.useRef)(null),[b,y]=(0,r.useState)(!1),[v,w]=(0,r.useState)(void 0),k=(0,r.useCallback)((()=>Pe?Promise.resolve():Promise.all([n.e(1426).then(n.bind(n,61426)),Promise.all([n.e(532),n.e(6945)]).then(n.bind(n,46945)),Promise.all([n.e(532),n.e(8894)]).then(n.bind(n,18894))]).then((e=>{let[{DocSearchModal:t}]=e;Pe=t}))),[]),x=(0,r.useCallback)((()=>{k().then((()=>{h.current=document.createElement("div"),document.body.insertBefore(h.current,document.body.firstChild),y(!0)}))}),[k,y]),S=(0,r.useCallback)((()=>{y(!1),h.current?.remove()}),[y]),_=(0,r.useCallback)((e=>{k().then((()=>{y(!0),w(e.key)}))}),[k,y,w]),E=(0,r.useRef)({navigate(e){let{itemUrl:t}=e;(0,ee.F)(a,t)?window.location.href=t:g.push(t)}}).current,C=(0,r.useRef)((e=>o.transformItems?o.transformItems(e):e.map((e=>({...e,url:l(e.url)}))))).current,T=(0,r.useMemo)((()=>e=>(0,u.jsx)(Re,{...e,onClose:S})),[S]),A=(0,r.useCallback)((e=>(e.addAlgoliaAgent("docusaurus",i.docusaurusVersion),e)),[i.docusaurusVersion]);return function(e){var t=e.isOpen,n=e.onOpen,a=e.onClose,o=e.onInput,i=e.searchButtonRef;r.useEffect((function(){function e(e){var r;(27===e.keyCode&&t||"k"===(null===(r=e.key)||void 0===r?void 0:r.toLowerCase())&&(e.metaKey||e.ctrlKey)||!function(e){var t=e.target,n=t.tagName;return t.isContentEditable||"INPUT"===n||"SELECT"===n||"TEXTAREA"===n}(e)&&"/"===e.key&&!t)&&(e.preventDefault(),t?a():document.body.classList.contains("DocSearch--active")||document.body.classList.contains("DocSearch--active")||n()),i&&i.current===document.activeElement&&o&&/[a-zA-Z0-9]/.test(String.fromCharCode(e.keyCode))&&o(e)}return window.addEventListener("keydown",e),function(){window.removeEventListener("keydown",e)}}),[t,n,a,o,i])}({isOpen:b,onOpen:x,onClose:S,onInput:_,searchButtonRef:m}),(0,u.jsxs)(u.Fragment,{children:[(0,u.jsx)(Te.Z,{children:(0,u.jsx)("link",{rel:"preconnect",href:`https://${o.appId}-dsn.algolia.net`,crossOrigin:"anonymous"})}),(0,u.jsx)(Ce,{onTouchStart:k,onFocus:k,onMouseOver:k,onClick:x,ref:m,translations:Le.button}),b&&Pe&&h.current&&(0,be.createPortal)((0,u.jsx)(Pe,{onClose:S,initialScrollY:window.scrollY,initialQuery:v,navigator:E,transformItems:C,hitComponent:Ie,transformSearchClient:A,...o.searchPagePath&&{resultsFooterComponent:T},...o,searchParameters:f,placeholder:Le.placeholder,translations:Le.modal}),h.current)]})}function Fe(){const{siteConfig:e}=(0,le.Z)();return(0,u.jsx)(Oe,{...e.themeConfig.algolia})}const Me={navbarSearchContainer:"navbarSearchContainer_Bca1"};function De(e){let{children:t,className:n}=e;return(0,u.jsx)("div",{className:(0,a.Z)(n,Me.navbarSearchContainer),children:t})}var Be=n(80143),ze=n(53438);var $e=n(60373);const Ue=e=>e.docs.find((t=>t.id===e.mainDocId));const Ze={default:oe,localeDropdown:function(e){let{mobile:t,dropdownItemsBefore:n,dropdownItemsAfter:r,queryString:a="",...o}=e;const{i18n:{currentLocale:i,locales:c,localeConfigs:d}}=(0,le.Z)(),p=(0,ge.l)(),{search:f,hash:g}=(0,s.TH)(),h=[...n,...c.map((e=>{const 
n=`${`pathname://${p.createUrl({locale:e,fullyQualified:!1})}`}${f}${g}${a}`;return{label:d[e].label,lang:d[e].htmlLang,to:n,target:"_self",autoAddBaseUrl:!1,className:e===i?t?"menu__link--active":"dropdown__link--active":""}})),...r],m=t?(0,l.I)({message:"Languages",id:"theme.navbar.mobileLanguageDropdown.label",description:"The label for the mobile language switcher dropdown"}):d[i].label;return(0,u.jsx)(fe,{...o,mobile:t,label:(0,u.jsxs)(u.Fragment,{children:[(0,u.jsx)(he,{className:me}),m]}),items:h})},search:function(e){let{mobile:t,className:n}=e;return t?null:(0,u.jsx)(De,{className:n,children:(0,u.jsx)(Fe,{})})},dropdown:fe,html:function(e){let{value:t,className:n,mobile:r=!1,isDropdownItem:o=!1}=e;const i=o?"li":"div";return(0,u.jsx)(i,{className:(0,a.Z)({navbar__item:!r&&!o,"menu__list-item":r},n),dangerouslySetInnerHTML:{__html:t}})},doc:function(e){let{docId:t,label:n,docsPluginId:r,...a}=e;const{activeDoc:o}=(0,Be.Iw)(r),i=(0,ze.vY)(t,r),s=o?.path===i?.path;return null===i||i.unlisted&&!s?null:(0,u.jsx)(oe,{exact:!0,...a,isActive:()=>s||!!o?.sidebar&&o.sidebar===i.sidebar,label:n??i.id,to:i.path})},docSidebar:function(e){let{sidebarId:t,label:n,docsPluginId:r,...a}=e;const{activeDoc:o}=(0,Be.Iw)(r),i=(0,ze.oz)(t,r).link;if(!i)throw new Error(`DocSidebarNavbarItem: Sidebar with ID "${t}" doesn't have anything to be linked to.`);return(0,u.jsx)(oe,{exact:!0,...a,isActive:()=>o?.sidebar===t,label:n??i.label,to:i.path})},docsVersion:function(e){let{label:t,to:n,docsPluginId:r,...a}=e;const o=(0,ze.lO)(r)[0],i=t??o.label,s=n??(e=>e.docs.find((t=>t.id===e.mainDocId)))(o).path;return(0,u.jsx)(oe,{...a,label:i,to:s})},docsVersionDropdown:function(e){let{mobile:t,docsPluginId:n,dropdownActiveClassDisabled:r,dropdownItemsBefore:a,dropdownItemsAfter:o,...i}=e;const{search:c,hash:d}=(0,s.TH)(),p=(0,Be.Iw)(n),f=(0,Be.gB)(n),{savePreferredVersionName:g}=(0,$e.J)(n),h=[...a,...f.map((e=>{const t=p.alternateDocVersions[e.name]??Ue(e);return{label:e.label,to:`${t.path}${c}${d}`,isActive:()=>e===p.activeVersion,onClick:()=>g(e.name)}})),...o],m=(0,ze.lO)(n)[0],b=t&&h.length>1?(0,l.I)({id:"theme.navbar.mobileVersionsDropdown.label",message:"Versions",description:"The label for the navbar versions dropdown on mobile view"}):m.label,y=t&&h.length>1?void 0:Ue(m).path;return h.length<=1?(0,u.jsx)(oe,{...i,mobile:t,label:b,to:y,isActive:r?()=>!1:void 0}):(0,u.jsx)(fe,{...i,mobile:t,label:b,to:y,items:h,isActive:r?()=>!1:void 0})}};function He(e){let{type:t,...n}=e;const r=function(e,t){return e&&"default"!==e?e:"items"in t?"dropdown":"default"}(t,n),a=Ze[r];if(!a)throw new Error(`No NavbarItem component found for type "${t}".`);return(0,u.jsx)(a,{...n})}function Ve(){const e=(0,j.e)(),t=(0,w.L)().navbar.items;return(0,u.jsx)("ul",{className:"menu__list",children:t.map(((t,n)=>(0,r.createElement)(He,{mobile:!0,...t,onClick:()=>e.toggle(),key:n})))})}function We(e){return(0,u.jsx)("button",{...e,type:"button",className:"clean-btn navbar-sidebar__back",children:(0,u.jsx)(l.Z,{id:"theme.navbar.mobileSidebarSecondaryMenu.backButtonLabel",description:"The label of the back button to return to main menu, inside the mobile navbar sidebar secondary menu (notably used to display the docs sidebar)",children:"\u2190 Back to main menu"})})}function Ge(){const e=0===(0,w.L)().navbar.items.length,t=F();return(0,u.jsxs)(u.Fragment,{children:[!e&&(0,u.jsx)(We,{onClick:()=>t.hide()}),t.content]})}function qe(){const e=(0,j.e)();var t;return void 
0===(t=e.shown)&&(t=!0),(0,r.useEffect)((()=>(document.body.style.overflow=t?"hidden":"visible",()=>{document.body.style.overflow="visible"})),[t]),e.shouldRender?(0,u.jsx)(M,{header:(0,u.jsx)(Y,{}),primaryMenu:(0,u.jsx)(Ve,{}),secondaryMenu:(0,u.jsx)(Ge,{})}):null}const Ke={navbarHideable:"navbarHideable_m1mJ",navbarHidden:"navbarHidden_jGov"};function Ye(e){return(0,u.jsx)("div",{role:"presentation",...e,className:(0,a.Z)("navbar-sidebar__backdrop",e.className)})}function Qe(e){let{children:t}=e;const{navbar:{hideOnScroll:n,style:o}}=(0,w.L)(),i=(0,j.e)(),{navbarRef:s,isNavbarVisible:d}=function(e){const[t,n]=(0,r.useState)(e),a=(0,r.useRef)(!1),o=(0,r.useRef)(0),i=(0,r.useCallback)((e=>{null!==e&&(o.current=e.getBoundingClientRect().height)}),[]);return(0,N.RF)(((t,r)=>{let{scrollY:i}=t;if(!e)return;if(i=s?n(!1):i+c{if(!e)return;const r=t.location.hash;if(r?document.getElementById(r.substring(1)):void 0)return a.current=!0,void n(!1);n(!0)})),{navbarRef:i,isNavbarVisible:t}}(n);return(0,u.jsxs)("nav",{ref:s,"aria-label":(0,l.I)({id:"theme.NavBar.navAriaLabel",message:"Main",description:"The ARIA label for the main navigation"}),className:(0,a.Z)("navbar","navbar--fixed-top",n&&[Ke.navbarHideable,!d&&Ke.navbarHidden],{"navbar--dark":"dark"===o,"navbar--primary":"primary"===o,"navbar-sidebar--show":i.shown}),children:[t,(0,u.jsx)(Ye,{onClick:i.toggle}),(0,u.jsx)(qe,{})]})}var Xe=n(69690);const Je="right";function et(e){let{width:t=30,height:n=30,className:r,...a}=e;return(0,u.jsx)("svg",{className:r,width:t,height:n,viewBox:"0 0 30 30","aria-hidden":"true",...a,children:(0,u.jsx)("path",{stroke:"currentColor",strokeLinecap:"round",strokeMiterlimit:"10",strokeWidth:"2",d:"M4 7h22M4 15h22M4 23h22"})})}function tt(){const{toggle:e,shown:t}=(0,j.e)();return(0,u.jsx)("button",{onClick:e,"aria-label":(0,l.I)({id:"theme.docs.sidebar.toggleSidebarButtonAriaLabel",message:"Toggle navigation bar",description:"The ARIA label for hamburger menu button of mobile navigation"}),"aria-expanded":t,className:"navbar__toggle clean-btn",type:"button",children:(0,u.jsx)(et,{})})}const nt={colorModeToggle:"colorModeToggle_DEke"};function rt(e){let{items:t}=e;return(0,u.jsx)(u.Fragment,{children:t.map(((e,t)=>(0,u.jsx)(Xe.QW,{onError:t=>new Error(`A theme navbar item failed to render.\nPlease double-check the following navbar item (themeConfig.navbar.items) of your Docusaurus config:\n${JSON.stringify(e,null,2)}`,{cause:t}),children:(0,u.jsx)(He,{...e})},t)))})}function at(e){let{left:t,right:n}=e;return(0,u.jsxs)("div",{className:"navbar__inner",children:[(0,u.jsx)("div",{className:"navbar__items",children:t}),(0,u.jsx)("div",{className:"navbar__items navbar__items--right",children:n})]})}function ot(){const e=(0,j.e)(),t=(0,w.L)().navbar.items,[n,r]=function(e){function t(e){return"left"===(e.position??Je)}return[e.filter(t),e.filter((e=>!t(e)))]}(t),a=t.find((e=>"search"===e.type));return(0,u.jsx)(at,{left:(0,u.jsxs)(u.Fragment,{children:[!e.disabled&&(0,u.jsx)(tt,{}),(0,u.jsx)(q,{}),(0,u.jsx)(rt,{items:n})]}),right:(0,u.jsxs)(u.Fragment,{children:[(0,u.jsx)(rt,{items:r}),(0,u.jsx)(W,{className:nt.colorModeToggle}),!a&&(0,u.jsx)(De,{children:(0,u.jsx)(Fe,{})})]})})}function it(){return(0,u.jsx)(Qe,{children:(0,u.jsx)(ot,{})})}function st(e){let{item:t}=e;const{to:n,href:r,label:a,prependBaseUrlToHref:o,...i}=t,s=(0,X.Z)(n),l=(0,X.Z)(r,{forcePrependBaseUrl:!0});return(0,u.jsxs)(Q.Z,{className:"footer__link-item",...r?{href:o?l:r}:{to:s},...i,children:[a,r&&!(0,J.Z)(r)&&(0,u.jsx)(te.Z,{})]})}function 
lt(e){let{item:t}=e;return t.html?(0,u.jsx)("li",{className:"footer__item",dangerouslySetInnerHTML:{__html:t.html}}):(0,u.jsx)("li",{className:"footer__item",children:(0,u.jsx)(st,{item:t})},t.href??t.to)}function ct(e){let{column:t}=e;return(0,u.jsxs)("div",{className:"col footer__col",children:[(0,u.jsx)("div",{className:"footer__title",children:t.title}),(0,u.jsx)("ul",{className:"footer__items clean-list",children:t.items.map(((e,t)=>(0,u.jsx)(lt,{item:e},t)))})]})}function ut(e){let{columns:t}=e;return(0,u.jsx)("div",{className:"row footer__links",children:t.map(((e,t)=>(0,u.jsx)(ct,{column:e},t)))})}function dt(){return(0,u.jsx)("span",{className:"footer__link-separator",children:"\xb7"})}function pt(e){let{item:t}=e;return t.html?(0,u.jsx)("span",{className:"footer__link-item",dangerouslySetInnerHTML:{__html:t.html}}):(0,u.jsx)(st,{item:t})}function ft(e){let{links:t}=e;return(0,u.jsx)("div",{className:"footer__links text--center",children:(0,u.jsx)("div",{className:"footer__links",children:t.map(((e,n)=>(0,u.jsxs)(r.Fragment,{children:[(0,u.jsx)(pt,{item:e}),t.length!==n+1&&(0,u.jsx)(dt,{})]},n)))})})}function gt(e){let{links:t}=e;return function(e){return"title"in e[0]}(t)?(0,u.jsx)(ut,{columns:t}):(0,u.jsx)(ft,{links:t})}var ht=n(19965);const mt={footerLogoLink:"footerLogoLink_BH7S"};function bt(e){let{logo:t}=e;const{withBaseUrl:n}=(0,X.C)(),r={light:n(t.src),dark:n(t.srcDark??t.src)};return(0,u.jsx)(ht.Z,{className:(0,a.Z)("footer__logo",t.className),alt:t.alt,sources:r,width:t.width,height:t.height,style:t.style})}function yt(e){let{logo:t}=e;return t.href?(0,u.jsx)(Q.Z,{href:t.href,className:mt.footerLogoLink,target:t.target,children:(0,u.jsx)(bt,{logo:t})}):(0,u.jsx)(bt,{logo:t})}function vt(e){let{copyright:t}=e;return(0,u.jsx)("div",{className:"footer__copyright",dangerouslySetInnerHTML:{__html:t}})}function wt(e){let{style:t,links:n,logo:r,copyright:o}=e;return(0,u.jsx)("footer",{className:(0,a.Z)("footer",{"footer--dark":"dark"===t}),children:(0,u.jsxs)("div",{className:"container container-fluid",children:[n,(r||o)&&(0,u.jsxs)("div",{className:"footer__bottom text--center",children:[r&&(0,u.jsx)("div",{className:"margin-bottom--sm",children:r}),o]})]})})}function kt(){const{footer:e}=(0,w.L)();if(!e)return null;const{copyright:t,links:n,logo:r,style:a}=e;return(0,u.jsx)(wt,{style:a,links:n&&n.length>0&&(0,u.jsx)(gt,{links:n}),logo:r&&(0,u.jsx)(yt,{logo:r}),copyright:t&&(0,u.jsx)(vt,{copyright:t})})}const xt=r.memo(kt),St=(0,L.Qc)([D.S,k.pl,N.OC,$e.L5,i.VC,function(e){let{children:t}=e;return(0,u.jsx)(P.n2,{children:(0,u.jsx)(j.M,{children:(0,u.jsx)(R,{children:t})})})}]);function _t(e){let{children:t}=e;return(0,u.jsx)(St,{children:t})}var Et=n(92503);function Ct(e){let{error:t,tryAgain:n}=e;return(0,u.jsx)("main",{className:"container margin-vert--xl",children:(0,u.jsx)("div",{className:"row",children:(0,u.jsxs)("div",{className:"col col--6 col--offset-3",children:[(0,u.jsx)(Et.Z,{as:"h1",className:"hero__title",children:(0,u.jsx)(l.Z,{id:"theme.ErrorPageContent.title",description:"The title of the fallback page when the page crashed",children:"This page crashed."})}),(0,u.jsx)("div",{className:"margin-vert--lg",children:(0,u.jsx)(Xe.Cw,{onClick:n,className:"button button--primary shadow--lw"})}),(0,u.jsx)("hr",{}),(0,u.jsx)("div",{className:"margin-vert--md",children:(0,u.jsx)(Xe.aG,{error:t})})]})})})}const Tt={mainWrapper:"mainWrapper_z2l0"};function 
At(e){const{children:t,noFooter:n,wrapperClassName:r,title:s,description:l}=e;return(0,b.t)(),(0,u.jsxs)(_t,{children:[(0,u.jsx)(i.d,{title:s,description:l}),(0,u.jsx)(v,{}),(0,u.jsx)(A,{}),(0,u.jsx)(it,{}),(0,u.jsx)("div",{id:d,className:(0,a.Z)(m.k.wrapper.main,Tt.mainWrapper,r),children:(0,u.jsx)(o.Z,{fallback:e=>(0,u.jsx)(Ct,{...e}),children:t})}),!n&&(0,u.jsx)(xt,{})]})}},21327:(e,t,n)=>{"use strict";n.d(t,{Z:()=>u});n(67294);var r=n(33692),a=n(44996),o=n(52263),i=n(86668),s=n(19965),l=n(85893);function c(e){let{logo:t,alt:n,imageClassName:r}=e;const o={light:(0,a.Z)(t.src),dark:(0,a.Z)(t.srcDark||t.src)},i=(0,l.jsx)(s.Z,{className:t.className,sources:o,height:t.height,width:t.width,alt:n,style:t.style});return r?(0,l.jsx)("div",{className:r,children:i}):i}function u(e){const{siteConfig:{title:t}}=(0,o.Z)(),{navbar:{title:n,logo:s}}=(0,i.L)(),{imageClassName:u,titleClassName:d,...p}=e,f=(0,a.Z)(s?.href||"/"),g=n?"":t,h=s?.alt??g;return(0,l.jsxs)(r.Z,{to:f,...p,...s?.target&&{target:s.target},children:[s&&(0,l.jsx)(c,{logo:s,alt:h,imageClassName:u}),null!=n&&(0,l.jsx)("b",{className:d,children:n})]})}},90197:(e,t,n)=>{"use strict";n.d(t,{Z:()=>o});n(67294);var r=n(35742),a=n(85893);function o(e){let{locale:t,version:n,tag:o}=e;const i=t;return(0,a.jsxs)(r.Z,{children:[t&&(0,a.jsx)("meta",{name:"docusaurus_locale",content:t}),n&&(0,a.jsx)("meta",{name:"docusaurus_version",content:n}),o&&(0,a.jsx)("meta",{name:"docusaurus_tag",content:o}),i&&(0,a.jsx)("meta",{name:"docsearch:language",content:i}),n&&(0,a.jsx)("meta",{name:"docsearch:version",content:n}),o&&(0,a.jsx)("meta",{name:"docsearch:docusaurus_tag",content:o})]})}},19965:(e,t,n)=>{"use strict";n.d(t,{Z:()=>u});var r=n(67294),a=n(788),o=n(72389),i=n(92949);const s={themedComponent:"themedComponent_mlkZ","themedComponent--light":"themedComponent--light_NVdE","themedComponent--dark":"themedComponent--dark_xIcU"};var l=n(85893);function c(e){let{className:t,children:n}=e;const c=(0,o.Z)(),{colorMode:u}=(0,i.I)();return(0,l.jsx)(l.Fragment,{children:(c?"dark"===u?["dark"]:["light"]:["light","dark"]).map((e=>{const o=n({theme:e,className:(0,a.Z)(t,s.themedComponent,s[`themedComponent--${e}`])});return(0,l.jsx)(r.Fragment,{children:o},e)}))})}function u(e){const{sources:t,className:n,alt:r,...a}=e;return(0,l.jsx)(c,{className:n,children:e=>{let{theme:n,className:o}=e;return(0,l.jsx)("img",{src:t[n],alt:r,className:o,...a})}})}},86043:(e,t,n)=>{"use strict";n.d(t,{u:()=>c,z:()=>b});var r=n(67294),a=n(10412),o=n(20469),i=n(91442),s=n(85893);const l="ease-in-out";function c(e){let{initialState:t}=e;const[n,a]=(0,r.useState)(t??!1),o=(0,r.useCallback)((()=>{a((e=>!e))}),[]);return{collapsed:n,setCollapsed:a,toggleCollapsed:o}}const u={display:"none",overflow:"hidden",height:"0px"},d={display:"block",overflow:"visible",height:"auto"};function p(e,t){const n=t?u:d;e.style.display=n.display,e.style.overflow=n.overflow,e.style.height=n.height}function f(e){let{collapsibleRef:t,collapsed:n,animation:a}=e;const o=(0,r.useRef)(!1);(0,r.useEffect)((()=>{const e=t.current;function r(){const t=e.scrollHeight,n=a?.duration??function(e){if((0,i.n)())return 1;const t=e/36;return Math.round(10*(4+15*t**.25+t/5))}(t);return{transition:`height ${n}ms ${a?.easing??l}`,height:`${t}px`}}function s(){const t=r();e.style.transition=t.transition,e.style.height=t.height}if(!o.current)return p(e,n),void(o.current=!0);return e.style.willChange="height",function(){const 
t=requestAnimationFrame((()=>{n?(s(),requestAnimationFrame((()=>{e.style.height=u.height,e.style.overflow=u.overflow}))):(e.style.display="block",requestAnimationFrame((()=>{s()})))}));return()=>cancelAnimationFrame(t)}()}),[t,n,a])}function g(e){if(!a.Z.canUseDOM)return e?u:d}function h(e){let{as:t="div",collapsed:n,children:a,animation:o,onCollapseTransitionEnd:i,className:l,disableSSRStyle:c}=e;const u=(0,r.useRef)(null);return f({collapsibleRef:u,collapsed:n,animation:o}),(0,s.jsx)(t,{ref:u,style:c?void 0:g(n),onTransitionEnd:e=>{"height"===e.propertyName&&(p(u.current,n),i?.(n))},className:l,children:a})}function m(e){let{collapsed:t,...n}=e;const[a,i]=(0,r.useState)(!t),[l,c]=(0,r.useState)(t);return(0,o.Z)((()=>{t||i(!0)}),[t]),(0,o.Z)((()=>{a&&c(t)}),[a,t]),a?(0,s.jsx)(h,{...n,collapsed:l}):null}function b(e){let{lazy:t,...n}=e;const r=t?m:h;return(0,s.jsx)(r,{...n})}},59689:(e,t,n)=>{"use strict";n.d(t,{nT:()=>h,pl:()=>g});var r=n(67294),a=n(72389),o=n(50012),i=n(902),s=n(86668),l=n(85893);const c=(0,o.WA)("docusaurus.announcement.dismiss"),u=(0,o.WA)("docusaurus.announcement.id"),d=()=>"true"===c.get(),p=e=>c.set(String(e)),f=r.createContext(null);function g(e){let{children:t}=e;const n=function(){const{announcementBar:e}=(0,s.L)(),t=(0,a.Z)(),[n,o]=(0,r.useState)((()=>!!t&&d()));(0,r.useEffect)((()=>{o(d())}),[]);const i=(0,r.useCallback)((()=>{p(!0),o(!0)}),[]);return(0,r.useEffect)((()=>{if(!e)return;const{id:t}=e;let n=u.get();"annoucement-bar"===n&&(n="announcement-bar");const r=t!==n;u.set(t),r&&p(!1),!r&&d()||o(!1)}),[e]),(0,r.useMemo)((()=>({isActive:!!e&&!n,close:i})),[e,n,i])}();return(0,l.jsx)(f.Provider,{value:n,children:t})}function h(){const e=(0,r.useContext)(f);if(!e)throw new i.i6("AnnouncementBarProvider");return e}},92949:(e,t,n)=>{"use strict";n.d(t,{I:()=>b,S:()=>m});var r=n(67294),a=n(10412),o=n(902),i=n(50012),s=n(86668),l=n(85893);const c=r.createContext(void 0),u="theme",d=(0,i.WA)(u),p={light:"light",dark:"dark"},f=e=>e===p.dark?p.dark:p.light,g=e=>a.Z.canUseDOM?f(document.documentElement.getAttribute("data-theme")):f(e),h=e=>{d.set(f(e))};function m(e){let{children:t}=e;const n=function(){const{colorMode:{defaultMode:e,disableSwitch:t,respectPrefersColorScheme:n}}=(0,s.L)(),[a,o]=(0,r.useState)(g(e));(0,r.useEffect)((()=>{t&&d.del()}),[t]);const i=(0,r.useCallback)((function(t,r){void 0===r&&(r={});const{persist:a=!0}=r;t?(o(t),a&&h(t)):(o(n?window.matchMedia("(prefers-color-scheme: dark)").matches?p.dark:p.light:e),d.del())}),[n,e]);(0,r.useEffect)((()=>{document.documentElement.setAttribute("data-theme",f(a))}),[a]),(0,r.useEffect)((()=>{if(t)return;const e=e=>{if(e.key!==u)return;const t=d.get();null!==t&&i(f(t))};return window.addEventListener("storage",e),()=>window.removeEventListener("storage",e)}),[t,i]);const l=(0,r.useRef)(!1);return(0,r.useEffect)((()=>{if(t&&!n)return;const e=window.matchMedia("(prefers-color-scheme: dark)"),r=()=>{window.matchMedia("print").matches||l.current?l.current=window.matchMedia("print").matches:i(null)};return e.addListener(r),()=>e.removeListener(r)}),[i,t,n]),(0,r.useMemo)((()=>({colorMode:a,setColorMode:i,get isDarkTheme(){return a===p.dark},setLightTheme(){i(p.light)},setDarkTheme(){i(p.dark)}})),[a,i])}();return(0,l.jsx)(c.Provider,{value:n,children:t})}function b(){const e=(0,r.useContext)(c);if(null==e)throw new o.i6("ColorModeProvider","Please see https://docusaurus.io/docs/api/themes/configuration#use-color-mode.");return e}},60373:(e,t,n)=>{"use strict";n.d(t,{J:()=>v,L5:()=>b,Oh:()=>w});var 
r=n(67294),a=n(80143),o=n(29935),i=n(86668),s=n(53438),l=n(902),c=n(50012),u=n(85893);const d=e=>`docs-preferred-version-${e}`,p={save:(e,t,n)=>{(0,c.WA)(d(e),{persistence:t}).set(n)},read:(e,t)=>(0,c.WA)(d(e),{persistence:t}).get(),clear:(e,t)=>{(0,c.WA)(d(e),{persistence:t}).del()}},f=e=>Object.fromEntries(e.map((e=>[e,{preferredVersionName:null}])));const g=r.createContext(null);function h(){const e=(0,a._r)(),t=(0,i.L)().docs.versionPersistence,n=(0,r.useMemo)((()=>Object.keys(e)),[e]),[o,s]=(0,r.useState)((()=>f(n)));(0,r.useEffect)((()=>{s(function(e){let{pluginIds:t,versionPersistence:n,allDocsData:r}=e;function a(e){const t=p.read(e,n);return r[e].versions.some((e=>e.name===t))?{preferredVersionName:t}:(p.clear(e,n),{preferredVersionName:null})}return Object.fromEntries(t.map((e=>[e,a(e)])))}({allDocsData:e,versionPersistence:t,pluginIds:n}))}),[e,t,n]);return[o,(0,r.useMemo)((()=>({savePreferredVersion:function(e,n){p.save(e,t,n),s((t=>({...t,[e]:{preferredVersionName:n}})))}})),[t])]}function m(e){let{children:t}=e;const n=h();return(0,u.jsx)(g.Provider,{value:n,children:t})}function b(e){let{children:t}=e;return s.cE?(0,u.jsx)(m,{children:t}):(0,u.jsx)(u.Fragment,{children:t})}function y(){const e=(0,r.useContext)(g);if(!e)throw new l.i6("DocsPreferredVersionContextProvider");return e}function v(e){void 0===e&&(e=o.m);const t=(0,a.zh)(e),[n,i]=y(),{preferredVersionName:s}=n[e];return{preferredVersion:t.versions.find((e=>e.name===s))??null,savePreferredVersionName:(0,r.useCallback)((t=>{i.savePreferredVersion(e,t)}),[i,e])}}function w(){const e=(0,a._r)(),[t]=y();function n(n){const r=e[n],{preferredVersionName:a}=t[n];return r.versions.find((e=>e.name===a))??null}const r=Object.keys(e);return Object.fromEntries(r.map((e=>[e,n(e)])))}},1116:(e,t,n)=>{"use strict";n.d(t,{V:()=>c,b:()=>l});var r=n(67294),a=n(902),o=n(85893);const i=Symbol("EmptyContext"),s=r.createContext(i);function l(e){let{children:t,name:n,items:a}=e;const i=(0,r.useMemo)((()=>n&&a?{name:n,items:a}:null),[n,a]);return(0,o.jsx)(s.Provider,{value:i,children:t})}function c(){const e=(0,r.useContext)(s);if(e===i)throw new a.i6("DocsSidebarProvider");return e}},74477:(e,t,n)=>{"use strict";n.d(t,{E:()=>l,q:()=>s});var r=n(67294),a=n(902),o=n(85893);const i=r.createContext(null);function s(e){let{children:t,version:n}=e;return(0,o.jsx)(i.Provider,{value:n,children:t})}function l(){const e=(0,r.useContext)(i);if(null===e)throw new a.i6("DocsVersionProvider");return e}},93163:(e,t,n)=>{"use strict";n.d(t,{M:()=>p,e:()=>f});var r=n(67294),a=n(13102),o=n(87524),i=n(91980),s=n(86668),l=n(902),c=n(85893);const u=r.createContext(void 0);function d(){const e=function(){const e=(0,a.HY)(),{items:t}=(0,s.L)().navbar;return 0===t.length&&!e.component}(),t=(0,o.i)(),n=!e&&"mobile"===t,[l,c]=(0,r.useState)(!1);(0,i.Rb)((()=>{if(l)return c(!1),!1}));const u=(0,r.useCallback)((()=>{c((e=>!e))}),[]);return(0,r.useEffect)((()=>{"desktop"===t&&c(!1)}),[t]),(0,r.useMemo)((()=>({disabled:e,shouldRender:n,toggle:u,shown:l})),[e,n,u,l])}function p(e){let{children:t}=e;const n=d();return(0,c.jsx)(u.Provider,{value:n,children:t})}function f(){const e=r.useContext(u);if(void 0===e)throw new l.i6("NavbarMobileSidebarProvider");return e}},13102:(e,t,n)=>{"use strict";n.d(t,{HY:()=>l,Zo:()=>c,n2:()=>s});var r=n(67294),a=n(902),o=n(85893);const i=r.createContext(null);function s(e){let{children:t}=e;const n=(0,r.useState)({component:null,props:null});return(0,o.jsx)(i.Provider,{value:n,children:t})}function l(){const 
e=(0,r.useContext)(i);if(!e)throw new a.i6("NavbarSecondaryMenuContentProvider");return e[0]}function c(e){let{component:t,props:n}=e;const o=(0,r.useContext)(i);if(!o)throw new a.i6("NavbarSecondaryMenuContentProvider");const[,s]=o,l=(0,a.Ql)(n);return(0,r.useEffect)((()=>{s({component:t,props:l})}),[s,t,l]),(0,r.useEffect)((()=>()=>s({component:null,props:null})),[s]),null}},19727:(e,t,n)=>{"use strict";n.d(t,{h:()=>a,t:()=>o});var r=n(67294);const a="navigation-with-keyboard";function o(){(0,r.useEffect)((()=>{function e(e){"keydown"===e.type&&"Tab"===e.key&&document.body.classList.add(a),"mousedown"===e.type&&document.body.classList.remove(a)}return document.addEventListener("keydown",e),document.addEventListener("mousedown",e),()=>{document.body.classList.remove(a),document.removeEventListener("keydown",e),document.removeEventListener("mousedown",e)}}),[])}},66177:(e,t,n)=>{"use strict";n.d(t,{K:()=>s,M:()=>l});var r=n(67294),a=n(52263),o=n(91980);const i="q";function s(){return(0,o.Nc)(i)}function l(){const{siteConfig:{baseUrl:e,themeConfig:t}}=(0,a.Z)(),{algolia:{searchPagePath:n}}=t;return(0,r.useCallback)((t=>`${e}${n}?${i}=${encodeURIComponent(t)}`),[e,n])}},87524:(e,t,n)=>{"use strict";n.d(t,{i:()=>s});var r=n(67294),a=n(10412);const o={desktop:"desktop",mobile:"mobile",ssr:"ssr"},i=996;function s(e){let{desktopBreakpoint:t=i}=void 0===e?{}:e;const[n,s]=(0,r.useState)((()=>"ssr"));return(0,r.useEffect)((()=>{function e(){s(function(e){if(!a.Z.canUseDOM)throw new Error("getWindowSize() should only be called after React hydration");return window.innerWidth>e?o.desktop:o.mobile}(t))}return e(),window.addEventListener("resize",e),()=>{window.removeEventListener("resize",e)}}),[t]),n}},35281:(e,t,n)=>{"use strict";n.d(t,{k:()=>r});const r={page:{blogListPage:"blog-list-page",blogPostPage:"blog-post-page",blogTagsListPage:"blog-tags-list-page",blogTagPostListPage:"blog-tags-post-list-page",docsDocPage:"docs-doc-page",docsTagsListPage:"docs-tags-list-page",docsTagDocListPage:"docs-tags-doc-list-page",mdxPage:"mdx-page"},wrapper:{main:"main-wrapper",blogPages:"blog-wrapper",docsPages:"docs-wrapper",mdxPages:"mdx-wrapper"},common:{editThisPage:"theme-edit-this-page",lastUpdated:"theme-last-updated",backToTopButton:"theme-back-to-top-button",codeBlock:"theme-code-block",admonition:"theme-admonition",unlistedBanner:"theme-unlisted-banner",admonitionType:e=>`theme-admonition-${e}`},layout:{},docs:{docVersionBanner:"theme-doc-version-banner",docVersionBadge:"theme-doc-version-badge",docBreadcrumbs:"theme-doc-breadcrumbs",docMarkdown:"theme-doc-markdown",docTocMobile:"theme-doc-toc-mobile",docTocDesktop:"theme-doc-toc-desktop",docFooter:"theme-doc-footer",docFooterTagsRow:"theme-doc-footer-tags-row",docFooterEditMetaRow:"theme-doc-footer-edit-meta-row",docSidebarContainer:"theme-doc-sidebar-container",docSidebarMenu:"theme-doc-sidebar-menu",docSidebarItemCategory:"theme-doc-sidebar-item-category",docSidebarItemLink:"theme-doc-sidebar-item-link",docSidebarItemCategoryLevel:e=>`theme-doc-sidebar-item-category-level-${e}`,docSidebarItemLinkLevel:e=>`theme-doc-sidebar-item-link-level-${e}`},blog:{}}},91442:(e,t,n)=>{"use strict";function r(){return window.matchMedia("(prefers-reduced-motion: reduce)").matches}n.d(t,{n:()=>r})},53438:(e,t,n)=>{"use strict";n.d(t,{LM:()=>g,MN:()=>T,SN:()=>C,_F:()=>y,cE:()=>p,f:()=>w,jA:()=>h,lO:()=>S,oz:()=>_,s1:()=>x,vY:()=>E,xz:()=>f});var r=n(67294),a=n(16550),o=n(18790),i=n(80143),s=n(60373),l=n(74477),c=n(1116),u=n(67392),d=n(48596);const p=!!i._r;function 
f(e){const t=(0,l.E)();if(!e)return;const n=t.docs[e];if(!n)throw new Error(`no version doc found by id=${e}`);return n}function g(e){return"link"!==e.type||e.unlisted?"category"===e.type?function(e){if(e.href&&!e.linkUnlisted)return e.href;for(const t of e.items){const e=g(t);if(e)return e}}(e):void 0:e.href}function h(){const{pathname:e}=(0,a.TH)(),t=(0,c.V)();if(!t)throw new Error("Unexpected: cant find current sidebar in context");const n=k({sidebarItems:t.items,pathname:e,onlyCategories:!0}).slice(-1)[0];if(!n)throw new Error(`${e} is not associated with a category. useCurrentSidebarCategory() should only be used on category index pages.`);return n}const m=(e,t)=>void 0!==e&&(0,d.Mg)(e,t),b=(e,t)=>e.some((e=>y(e,t)));function y(e,t){return"link"===e.type?m(e.href,t):"category"===e.type&&(m(e.href,t)||b(e.items,t))}function v(e,t){switch(e.type){case"category":return y(e,t)||e.items.some((e=>v(e,t)));case"link":return!e.unlisted||y(e,t);default:return!0}}function w(e,t){return(0,r.useMemo)((()=>e.filter((e=>v(e,t)))),[e,t])}function k(e){let{sidebarItems:t,pathname:n,onlyCategories:r=!1}=e;const a=[];return function e(t){for(const o of t)if("category"===o.type&&((0,d.Mg)(o.href,n)||e(o.items))||"link"===o.type&&(0,d.Mg)(o.href,n)){return r&&"category"!==o.type||a.unshift(o),!0}return!1}(t),a}function x(){const e=(0,c.V)(),{pathname:t}=(0,a.TH)(),n=(0,i.gA)()?.pluginData.breadcrumbs;return!1!==n&&e?k({sidebarItems:e.items,pathname:t}):null}function S(e){const{activeVersion:t}=(0,i.Iw)(e),{preferredVersion:n}=(0,s.J)(e),a=(0,i.yW)(e);return(0,r.useMemo)((()=>(0,u.j)([t,n,a].filter(Boolean))),[t,n,a])}function _(e,t){const n=S(t);return(0,r.useMemo)((()=>{const t=n.flatMap((e=>e.sidebars?Object.entries(e.sidebars):[])),r=t.find((t=>t[0]===e));if(!r)throw new Error(`Can't find any sidebar with id "${e}" in version${n.length>1?"s":""} ${n.map((e=>e.name)).join(", ")}".\nAvailable sidebar ids are:\n- ${t.map((e=>e[0])).join("\n- ")}`);return r[1]}),[e,n])}function E(e,t){const n=S(t);return(0,r.useMemo)((()=>{const t=n.flatMap((e=>e.docs)),r=t.find((t=>t.id===e));if(!r){if(n.flatMap((e=>e.draftIds)).includes(e))return null;throw new Error(`Couldn't find any doc with id "${e}" in version${n.length>1?"s":""} "${n.map((e=>e.name)).join(", ")}".\nAvailable doc ids are:\n- ${(0,u.j)(t.map((e=>e.id))).join("\n- ")}`)}return r}),[e,n])}function C(e){let{route:t}=e;const n=(0,a.TH)(),r=(0,l.E)(),i=t.routes,s=i.find((e=>(0,a.LX)(n.pathname,e)));if(!s)return null;const c=s.sidebar,u=c?r.docsSidebars[c]:void 0;return{docElement:(0,o.H)(i),sidebarName:c,sidebarItems:u}}function T(e){return e.filter((e=>!("category"===e.type||"link"===e.type)||!!g(e)))}},69690:(e,t,n)=>{"use strict";n.d(t,{aG:()=>u,Ac:()=>c,Cw:()=>l,QW:()=>d});var r=n(67294),a=n(95999),o=n(18780);const i={errorBoundaryError:"errorBoundaryError_a6uf",errorBoundaryFallback:"errorBoundaryFallback_VBag"};var s=n(85893);function l(e){return(0,s.jsx)("button",{type:"button",...e,children:(0,s.jsx)(a.Z,{id:"theme.ErrorPageContent.tryAgain",description:"The label of the button to try again rendering when the React error boundary captures an error",children:"Try again"})})}function c(e){let{error:t,tryAgain:n}=e;return(0,s.jsxs)("div",{className:i.errorBoundaryFallback,children:[(0,s.jsx)("p",{children:t.message}),(0,s.jsx)(l,{onClick:n})]})}function u(e){let{error:t}=e;const n=(0,o.getErrorCausalChain)(t).map((e=>e.message)).join("\n\nCause:\n");return(0,s.jsx)("p",{className:i.errorBoundaryError,children:n})}class d extends 
r.Component{componentDidCatch(e,t){throw this.props.onError(e,t)}render(){return this.props.children}}},82128:(e,t,n)=>{"use strict";n.d(t,{p:()=>a});var r=n(52263);function a(e){const{siteConfig:t}=(0,r.Z)(),{title:n,titleDelimiter:a}=t;return e?.trim().length?`${e.trim()} ${a} ${n}`:n}},91980:(e,t,n)=>{"use strict";n.d(t,{Nc:()=>l,Rb:()=>i,_X:()=>s});var r=n(67294),a=n(16550),o=n(902);function i(e){!function(e){const t=(0,a.k6)(),n=(0,o.zX)(e);(0,r.useEffect)((()=>t.block(((e,t)=>n(e,t)))),[t,n])}(((t,n)=>{if("POP"===n)return e(t,n)}))}function s(e){return function(e){const t=(0,a.k6)();return(0,r.useSyncExternalStore)(t.listen,(()=>e(t)),(()=>e(t)))}((t=>null===e?null:new URLSearchParams(t.location.search).get(e)))}function l(e){const t=s(e)??"",n=function(){const e=(0,a.k6)();return(0,r.useCallback)(((t,n,r)=>{const a=new URLSearchParams(e.location.search);n?a.set(t,n):a.delete(t),(r?.push?e.push:e.replace)({search:a.toString()})}),[e])}();return[t,(0,r.useCallback)(((t,r)=>{n(e,t,r)}),[n,e])]}},67392:(e,t,n)=>{"use strict";function r(e,t){return void 0===t&&(t=(e,t)=>e===t),e.filter(((n,r)=>e.findIndex((e=>t(e,n)))!==r))}function a(e){return Array.from(new Set(e))}n.d(t,{j:()=>a,l:()=>r})},10833:(e,t,n)=>{"use strict";n.d(t,{FG:()=>f,d:()=>d,VC:()=>g});var r=n(67294),a=n(788),o=n(35742),i=n(30226);function s(){const e=r.useContext(i._);if(!e)throw new Error("Unexpected: no Docusaurus route context found");return e}var l=n(44996),c=n(82128),u=n(85893);function d(e){let{title:t,description:n,keywords:r,image:a,children:i}=e;const s=(0,c.p)(t),{withBaseUrl:d}=(0,l.C)(),p=a?d(a,{absolute:!0}):void 0;return(0,u.jsxs)(o.Z,{children:[t&&(0,u.jsx)("title",{children:s}),t&&(0,u.jsx)("meta",{property:"og:title",content:s}),n&&(0,u.jsx)("meta",{name:"description",content:n}),n&&(0,u.jsx)("meta",{property:"og:description",content:n}),r&&(0,u.jsx)("meta",{name:"keywords",content:Array.isArray(r)?r.join(","):r}),p&&(0,u.jsx)("meta",{property:"og:image",content:p}),p&&(0,u.jsx)("meta",{name:"twitter:image",content:p}),i]})}const p=r.createContext(void 0);function f(e){let{className:t,children:n}=e;const i=r.useContext(p),s=(0,a.Z)(i,t);return(0,u.jsxs)(p.Provider,{value:s,children:[(0,u.jsx)(o.Z,{children:(0,u.jsx)("html",{className:s})}),n]})}function g(e){let{children:t}=e;const n=s(),r=`plugin-${n.plugin.name.replace(/docusaurus-(?:plugin|theme)-(?:content-)?/gi,"")}`;const o=`plugin-id-${n.plugin.id}`;return(0,u.jsx)(f,{className:(0,a.Z)(r,o),children:t})}},902:(e,t,n)=>{"use strict";n.d(t,{D9:()=>s,Qc:()=>u,Ql:()=>c,i6:()=>l,zX:()=>i});var r=n(67294),a=n(20469),o=n(85893);function i(e){const t=(0,r.useRef)(e);return(0,a.Z)((()=>{t.current=e}),[e]),(0,r.useCallback)((function(){return t.current(...arguments)}),[])}function s(e){const t=(0,r.useRef)();return(0,a.Z)((()=>{t.current=e})),t.current}class l extends Error{constructor(e,t){super(),this.name="ReactContextError",this.message=`Hook ${this.stack?.split("\n")[1]?.match(/at (?:\w+\.)?(?\w+)/)?.groups.name??""} is called outside the <${e}>. 
${t??""}`}}function c(e){const t=Object.entries(e);return t.sort(((e,t)=>e[0].localeCompare(t[0]))),(0,r.useMemo)((()=>e),t.flat())}function u(e){return t=>{let{children:n}=t;return(0,o.jsx)(o.Fragment,{children:e.reduceRight(((e,t)=>(0,o.jsx)(t,{children:e})),n)})}}},98022:(e,t,n)=>{"use strict";function r(e,t){return void 0!==e&&void 0!==t&&new RegExp(e,"gi").test(t)}n.d(t,{F:()=>r})},48596:(e,t,n)=>{"use strict";n.d(t,{Mg:()=>i,Ns:()=>s});var r=n(67294),a=n(723),o=n(52263);function i(e,t){const n=e=>(!e||e.endsWith("/")?e:`${e}/`)?.toLowerCase();return n(e)===n(t)}function s(){const{baseUrl:e}=(0,o.Z)().siteConfig;return(0,r.useMemo)((()=>function(e){let{baseUrl:t,routes:n}=e;function r(e){return e.path===t&&!0===e.exact}function a(e){return e.path===t&&!e.exact}return function e(t){if(0===t.length)return;return t.find(r)||e(t.filter(a).flatMap((e=>e.routes??[])))}(n)}({routes:a.Z,baseUrl:e})),[e])}},12466:(e,t,n)=>{"use strict";n.d(t,{Ct:()=>h,OC:()=>u,RF:()=>f,o5:()=>g});var r=n(67294),a=n(10412),o=n(72389),i=n(20469),s=n(902),l=n(85893);const c=r.createContext(void 0);function u(e){let{children:t}=e;const n=function(){const e=(0,r.useRef)(!0);return(0,r.useMemo)((()=>({scrollEventsEnabledRef:e,enableScrollEvents:()=>{e.current=!0},disableScrollEvents:()=>{e.current=!1}})),[])}();return(0,l.jsx)(c.Provider,{value:n,children:t})}function d(){const e=(0,r.useContext)(c);if(null==e)throw new s.i6("ScrollControllerProvider");return e}const p=()=>a.Z.canUseDOM?{scrollX:window.pageXOffset,scrollY:window.pageYOffset}:null;function f(e,t){void 0===t&&(t=[]);const{scrollEventsEnabledRef:n}=d(),a=(0,r.useRef)(p()),o=(0,s.zX)(e);(0,r.useEffect)((()=>{const e=()=>{if(!n.current)return;const e=p();o(e,a.current),a.current=e},t={passive:!0};return e(),window.addEventListener("scroll",e,t),()=>window.removeEventListener("scroll",e,t)}),[o,n,...t])}function g(){const e=d(),t=function(){const e=(0,r.useRef)({elem:null,top:0}),t=(0,r.useCallback)((t=>{e.current={elem:t,top:t.getBoundingClientRect().top}}),[]),n=(0,r.useCallback)((()=>{const{current:{elem:t,top:n}}=e;if(!t)return{restored:!1};const r=t.getBoundingClientRect().top-n;return r&&window.scrollBy({left:0,top:r}),e.current={elem:null,top:0},{restored:0!==r}}),[]);return(0,r.useMemo)((()=>({save:t,restore:n})),[n,t])}(),n=(0,r.useRef)(void 0),a=(0,r.useCallback)((r=>{t.save(r),e.disableScrollEvents(),n.current=()=>{const{restored:r}=t.restore();if(n.current=void 0,r){const t=()=>{e.enableScrollEvents(),window.removeEventListener("scroll",t)};window.addEventListener("scroll",t)}else e.enableScrollEvents()}}),[e,t]);return(0,i.Z)((()=>{queueMicrotask((()=>n.current?.()))})),{blockElementScrollPositionUntilNextRender:a}}function h(){const e=(0,r.useRef)(null),t=(0,o.Z)()&&"smooth"===getComputedStyle(document.documentElement).scrollBehavior;return{startScroll:n=>{e.current=t?function(e){return window.scrollTo({top:e,behavior:"smooth"}),()=>{}}(n):function(e){let t=null;const n=document.documentElement.scrollTop>e;return function r(){const a=document.documentElement.scrollTop;(n&&a>e||!n&&at&&cancelAnimationFrame(t)}(n)},cancelScroll:()=>e.current?.()}}},43320:(e,t,n)=>{"use strict";n.d(t,{HX:()=>i,_q:()=>l,os:()=>s});var r=n(80143),a=n(52263),o=n(60373);const i="default";function s(e,t){return`docs-${e}-${t}`}function l(){const{i18n:e}=(0,a.Z)(),t=(0,r._r)(),n=(0,r.WS)(),l=(0,o.Oh)();const c=[i,...Object.keys(t).map((function(e){const r=n?.activePlugin.pluginId===e?n.activeVersion:void 0,a=l[e],o=t[e].versions.find((e=>e.isLast));return 
s(e,(r??a??o).name)}))];return{locale:e.currentLocale,tags:c}}},50012:(e,t,n)=>{"use strict";n.d(t,{Nk:()=>u,WA:()=>c});var r=n(67294);const a="localStorage";function o(e){let{key:t,oldValue:n,newValue:r,storage:a}=e;if(n===r)return;const o=document.createEvent("StorageEvent");o.initStorageEvent("storage",!1,!1,t,n,r,window.location.href,a),window.dispatchEvent(o)}function i(e){if(void 0===e&&(e=a),"undefined"==typeof window)throw new Error("Browser storage is not available on Node.js/Docusaurus SSR process.");if("none"===e)return null;try{return window[e]}catch(n){return t=n,s||(console.warn("Docusaurus browser storage is not available.\nPossible reasons: running Docusaurus in an iframe, in an incognito browser session, or using too strict browser privacy settings.",t),s=!0),null}var t}let s=!1;const l={get:()=>null,set:()=>{},del:()=>{},listen:()=>()=>{}};function c(e,t){if("undefined"==typeof window)return function(e){function t(){throw new Error(`Illegal storage API usage for storage key "${e}".\nDocusaurus storage APIs are not supposed to be called on the server-rendering process.\nPlease only call storage APIs in effects and event handlers.`)}return{get:t,set:t,del:t,listen:t}}(e);const n=i(t?.persistence);return null===n?l:{get:()=>{try{return n.getItem(e)}catch(t){return console.error(`Docusaurus storage error, can't get key=${e}`,t),null}},set:t=>{try{const r=n.getItem(e);n.setItem(e,t),o({key:e,oldValue:r,newValue:t,storage:n})}catch(r){console.error(`Docusaurus storage error, can't set ${e}=${t}`,r)}},del:()=>{try{const t=n.getItem(e);n.removeItem(e),o({key:e,oldValue:t,newValue:null,storage:n})}catch(t){console.error(`Docusaurus storage error, can't delete key=${e}`,t)}},listen:t=>{try{const r=r=>{r.storageArea===n&&r.key===e&&t(r)};return window.addEventListener("storage",r),()=>window.removeEventListener("storage",r)}catch(r){return console.error(`Docusaurus storage error, can't listen for changes of key=${e}`,r),()=>{}}}}}function u(e,t){const n=(0,r.useRef)((()=>null===e?l:c(e,t))).current(),a=(0,r.useCallback)((e=>"undefined"==typeof window?()=>{}:n.listen(e)),[n]);return[(0,r.useSyncExternalStore)(a,(()=>"undefined"==typeof window?null:n.get()),(()=>null)),n]}},94711:(e,t,n)=>{"use strict";n.d(t,{l:()=>i});var r=n(52263),a=n(16550),o=n(18780);function i(){const{siteConfig:{baseUrl:e,url:t,trailingSlash:n},i18n:{defaultLocale:i,currentLocale:s}}=(0,r.Z)(),{pathname:l}=(0,a.TH)(),c=(0,o.applyTrailingSlash)(l,{trailingSlash:n,baseUrl:e}),u=s===i?e:e.replace(`/${s}/`,"/"),d=c.replace(e,"");return{createUrl:function(e){let{locale:n,fullyQualified:r}=e;return`${r?t:""}${function(e){return e===i?`${u}`:`${u}${e}/`}(n)}${d}`}}}},85936:(e,t,n)=>{"use strict";n.d(t,{S:()=>i});var r=n(67294),a=n(16550),o=n(902);function i(e){const t=(0,a.TH)(),n=(0,o.D9)(t),i=(0,o.zX)(e);(0,r.useEffect)((()=>{n&&t!==n&&i({location:t,previousLocation:n})}),[i,t,n])}},86668:(e,t,n)=>{"use strict";n.d(t,{L:()=>a});var r=n(52263);function a(){return(0,r.Z)().siteConfig.themeConfig}},6278:(e,t,n)=>{"use strict";n.d(t,{L:()=>a});var r=n(52263);function a(){const{siteConfig:{themeConfig:e}}=(0,r.Z)();return e}},239:(e,t,n)=>{"use strict";n.d(t,{l:()=>s});var r=n(67294),a=n(98022),o=n(44996),i=n(6278);function s(){const{withBaseUrl:e}=(0,o.C)(),{algolia:{externalUrlRegex:t,replaceSearchResultPathname:n}}=(0,i.L)();return(0,r.useCallback)((r=>{const o=new URL(r);if((0,a.F)(t,o.href))return r;const i=`${o.pathname+o.hash}`;return e(function(e,t){return t?e.replaceAll(new 
RegExp(t.from,"g"),t.to):e}(i,n))}),[e,t,n])}},8802:(e,t)=>{"use strict";Object.defineProperty(t,"__esModule",{value:!0}),t.default=function(e,t){const{trailingSlash:n,baseUrl:r}=t;if(e.startsWith("#"))return e;if(void 0===n)return e;const[a]=e.split(/[#?]/),o="/"===a||a===r?a:(i=a,n?function(e){return e.endsWith("/")?e:`${e}/`}(i):function(e){return e.endsWith("/")?e.slice(0,-1):e}(i));var i;return e.replace(a,o)}},54143:(e,t)=>{"use strict";Object.defineProperty(t,"__esModule",{value:!0}),t.getErrorCausalChain=void 0,t.getErrorCausalChain=function e(t){return t.cause?[t,...e(t.cause)]:[t]}},18780:function(e,t,n){"use strict";var r=this&&this.__importDefault||function(e){return e&&e.__esModule?e:{default:e}};Object.defineProperty(t,"__esModule",{value:!0}),t.getErrorCausalChain=t.applyTrailingSlash=t.blogPostContainerID=void 0,t.blogPostContainerID="__blog-post-container";var a=n(8802);Object.defineProperty(t,"applyTrailingSlash",{enumerable:!0,get:function(){return r(a).default}});var o=n(54143);Object.defineProperty(t,"getErrorCausalChain",{enumerable:!0,get:function(){return o.getErrorCausalChain}})},99318:(e,t,n)=>{"use strict";n.d(t,{lX:()=>w,q_:()=>C,ob:()=>f,PP:()=>A,Ep:()=>p});var r=n(87462);function a(e){return"/"===e.charAt(0)}function o(e,t){for(var n=t,r=n+1,a=e.length;r=0;p--){var f=i[p];"."===f?o(i,p):".."===f?(o(i,p),d++):d&&(o(i,p),d--)}if(!c)for(;d--;d)i.unshift("..");!c||""===i[0]||i[0]&&a(i[0])||i.unshift("");var g=i.join("/");return n&&"/"!==g.substr(-1)&&(g+="/"),g};var s=n(38776);function l(e){return"/"===e.charAt(0)?e:"/"+e}function c(e){return"/"===e.charAt(0)?e.substr(1):e}function u(e,t){return function(e,t){return 0===e.toLowerCase().indexOf(t.toLowerCase())&&-1!=="/?#".indexOf(e.charAt(t.length))}(e,t)?e.substr(t.length):e}function d(e){return"/"===e.charAt(e.length-1)?e.slice(0,-1):e}function p(e){var t=e.pathname,n=e.search,r=e.hash,a=t||"/";return n&&"?"!==n&&(a+="?"===n.charAt(0)?n:"?"+n),r&&"#"!==r&&(a+="#"===r.charAt(0)?r:"#"+r),a}function f(e,t,n,a){var o;"string"==typeof e?(o=function(e){var t=e||"/",n="",r="",a=t.indexOf("#");-1!==a&&(r=t.substr(a),t=t.substr(0,a));var o=t.indexOf("?");return-1!==o&&(n=t.substr(o),t=t.substr(0,o)),{pathname:t,search:"?"===n?"":n,hash:"#"===r?"":r}}(e),o.state=t):(void 0===(o=(0,r.Z)({},e)).pathname&&(o.pathname=""),o.search?"?"!==o.search.charAt(0)&&(o.search="?"+o.search):o.search="",o.hash?"#"!==o.hash.charAt(0)&&(o.hash="#"+o.hash):o.hash="",void 0!==t&&void 0===o.state&&(o.state=t));try{o.pathname=decodeURI(o.pathname)}catch(s){throw s instanceof URIError?new URIError('Pathname "'+o.pathname+'" could not be decoded. 
This is likely caused by an invalid percent-encoding.'):s}return n&&(o.key=n),a?o.pathname?"/"!==o.pathname.charAt(0)&&(o.pathname=i(o.pathname,a.pathname)):o.pathname=a.pathname:o.pathname||(o.pathname="/"),o}function g(){var e=null;var t=[];return{setPrompt:function(t){return e=t,function(){e===t&&(e=null)}},confirmTransitionTo:function(t,n,r,a){if(null!=e){var o="function"==typeof e?e(t,n):e;"string"==typeof o?"function"==typeof r?r(o,a):a(!0):a(!1!==o)}else a(!0)},appendListener:function(e){var n=!0;function r(){n&&e.apply(void 0,arguments)}return t.push(r),function(){n=!1,t=t.filter((function(e){return e!==r}))}},notifyListeners:function(){for(var e=arguments.length,n=new Array(e),r=0;rt?n.splice(t,n.length-t,a):n.push(a),d({action:r,location:a,index:t,entries:n})}}))},replace:function(e,t){var r="REPLACE",a=f(e,t,h(),w.location);u.confirmTransitionTo(a,r,n,(function(e){e&&(w.entries[w.index]=a,d({action:r,location:a}))}))},go:v,goBack:function(){v(-1)},goForward:function(){v(1)},canGo:function(e){var t=w.index+e;return t>=0&&t{"use strict";var r=n(59864),a={childContextTypes:!0,contextType:!0,contextTypes:!0,defaultProps:!0,displayName:!0,getDefaultProps:!0,getDerivedStateFromError:!0,getDerivedStateFromProps:!0,mixins:!0,propTypes:!0,type:!0},o={name:!0,length:!0,prototype:!0,caller:!0,callee:!0,arguments:!0,arity:!0},i={$$typeof:!0,compare:!0,defaultProps:!0,displayName:!0,propTypes:!0,type:!0},s={};function l(e){return r.isMemo(e)?i:s[e.$$typeof]||a}s[r.ForwardRef]={$$typeof:!0,render:!0,defaultProps:!0,displayName:!0,propTypes:!0},s[r.Memo]=i;var c=Object.defineProperty,u=Object.getOwnPropertyNames,d=Object.getOwnPropertySymbols,p=Object.getOwnPropertyDescriptor,f=Object.getPrototypeOf,g=Object.prototype;e.exports=function e(t,n,r){if("string"!=typeof n){if(g){var a=f(n);a&&a!==g&&e(t,a,r)}var i=u(n);d&&(i=i.concat(d(n)));for(var s=l(t),h=l(n),m=0;m{"use strict";e.exports=function(e,t,n,r,a,o,i,s){if(!e){var l;if(void 0===t)l=new Error("Minified exception occurred; use the non-minified dev environment for the full error message and additional helpful warnings.");else{var c=[n,r,a,o,i,s],u=0;(l=new Error(t.replace(/%s/g,(function(){return c[u++]})))).name="Invariant Violation"}throw l.framesToPop=1,l}}},5826:e=>{e.exports=Array.isArray||function(e){return"[object Array]"==Object.prototype.toString.call(e)}},7439:(e,t,n)=>{"use strict";n.r(t)},32497:(e,t,n)=>{"use strict";n.r(t)},29268:(e,t,n)=>{"use strict";n.r(t)},74865:function(e,t,n){var r,a;r=function(){var e,t,n={version:"0.2.0"},r=n.settings={minimum:.08,easing:"ease",positionUsing:"",speed:200,trickle:!0,trickleRate:.02,trickleSpeed:800,showSpinner:!0,barSelector:'[role="bar"]',spinnerSelector:'[role="spinner"]',parent:"body",template:'
    '};function a(e,t,n){return en?n:e}function o(e){return 100*(-1+e)}function i(e,t,n){var a;return(a="translate3d"===r.positionUsing?{transform:"translate3d("+o(e)+"%,0,0)"}:"translate"===r.positionUsing?{transform:"translate("+o(e)+"%,0)"}:{"margin-left":o(e)+"%"}).transition="all "+t+"ms "+n,a}n.configure=function(e){var t,n;for(t in e)void 0!==(n=e[t])&&e.hasOwnProperty(t)&&(r[t]=n);return this},n.status=null,n.set=function(e){var t=n.isStarted();e=a(e,r.minimum,1),n.status=1===e?null:e;var o=n.render(!t),c=o.querySelector(r.barSelector),u=r.speed,d=r.easing;return o.offsetWidth,s((function(t){""===r.positionUsing&&(r.positionUsing=n.getPositioningCSS()),l(c,i(e,u,d)),1===e?(l(o,{transition:"none",opacity:1}),o.offsetWidth,setTimeout((function(){l(o,{transition:"all "+u+"ms linear",opacity:0}),setTimeout((function(){n.remove(),t()}),u)}),u)):setTimeout(t,u)})),this},n.isStarted=function(){return"number"==typeof n.status},n.start=function(){n.status||n.set(0);var e=function(){setTimeout((function(){n.status&&(n.trickle(),e())}),r.trickleSpeed)};return r.trickle&&e(),this},n.done=function(e){return e||n.status?n.inc(.3+.5*Math.random()).set(1):this},n.inc=function(e){var t=n.status;return t?("number"!=typeof e&&(e=(1-t)*a(Math.random()*t,.1,.95)),t=a(t+e,0,.994),n.set(t)):n.start()},n.trickle=function(){return n.inc(Math.random()*r.trickleRate)},e=0,t=0,n.promise=function(r){return r&&"resolved"!==r.state()?(0===t&&n.start(),e++,t++,r.always((function(){0==--t?(e=0,n.done()):n.set((e-t)/e)})),this):this},n.render=function(e){if(n.isRendered())return document.getElementById("nprogress");u(document.documentElement,"nprogress-busy");var t=document.createElement("div");t.id="nprogress",t.innerHTML=r.template;var a,i=t.querySelector(r.barSelector),s=e?"-100":o(n.status||0),c=document.querySelector(r.parent);return l(i,{transition:"all 0 linear",transform:"translate3d("+s+"%,0,0)"}),r.showSpinner||(a=t.querySelector(r.spinnerSelector))&&f(a),c!=document.body&&u(c,"nprogress-custom-parent"),c.appendChild(t),t},n.remove=function(){d(document.documentElement,"nprogress-busy"),d(document.querySelector(r.parent),"nprogress-custom-parent");var e=document.getElementById("nprogress");e&&f(e)},n.isRendered=function(){return!!document.getElementById("nprogress")},n.getPositioningCSS=function(){var e=document.body.style,t="WebkitTransform"in e?"Webkit":"MozTransform"in e?"Moz":"msTransform"in e?"ms":"OTransform"in e?"O":"";return t+"Perspective"in e?"translate3d":t+"Transform"in e?"translate":"margin"};var s=function(){var e=[];function t(){var n=e.shift();n&&n(t)}return function(n){e.push(n),1==e.length&&t()}}(),l=function(){var e=["Webkit","O","Moz","ms"],t={};function n(e){return e.replace(/^-ms-/,"ms-").replace(/-([\da-z])/gi,(function(e,t){return t.toUpperCase()}))}function r(t){var n=document.body.style;if(t in n)return t;for(var r,a=e.length,o=t.charAt(0).toUpperCase()+t.slice(1);a--;)if((r=e[a]+o)in n)return r;return t}function a(e){return e=n(e),t[e]||(t[e]=r(e))}function o(e,t,n){t=a(t),e.style[t]=n}return function(e,t){var n,r,a=arguments;if(2==a.length)for(n in t)void 0!==(r=t[n])&&t.hasOwnProperty(n)&&o(e,n,r);else o(e,a[1],a[2])}}();function c(e,t){return("string"==typeof e?e:p(e)).indexOf(" "+t+" ")>=0}function u(e,t){var n=p(e),r=n+t;c(n,t)||(e.className=r.substring(1))}function d(e,t){var n,r=p(e);c(e,t)&&(n=r.replace(" "+t+" "," "),e.className=n.substring(1,n.length-1))}function p(e){return(" "+(e.className||"")+" ").replace(/\s+/gi," ")}function 
f(e){e&&e.parentNode&&e.parentNode.removeChild(e)}return n},void 0===(a="function"==typeof r?r.call(t,n,t,e):r)||(e.exports=a)},85795:()=>{Prism.languages.ada={comment:/--.*/,string:/"(?:""|[^"\r\f\n])*"/,number:[{pattern:/\b\d(?:_?\d)*#[\dA-F](?:_?[\dA-F])*(?:\.[\dA-F](?:_?[\dA-F])*)?#(?:E[+-]?\d(?:_?\d)*)?/i},{pattern:/\b\d(?:_?\d)*(?:\.\d(?:_?\d)*)?(?:E[+-]?\d(?:_?\d)*)?\b/i}],attribute:{pattern:/\b'\w+/,alias:"attr-name"},keyword:/\b(?:abort|abs|abstract|accept|access|aliased|all|and|array|at|begin|body|case|constant|declare|delay|delta|digits|do|else|elsif|end|entry|exception|exit|for|function|generic|goto|if|in|interface|is|limited|loop|mod|new|not|null|of|or|others|out|overriding|package|pragma|private|procedure|protected|raise|range|record|rem|renames|requeue|return|reverse|select|separate|some|subtype|synchronized|tagged|task|terminate|then|type|until|use|when|while|with|xor)\b/i,boolean:/\b(?:false|true)\b/i,operator:/<[=>]?|>=?|=>?|:=|\/=?|\*\*?|[&+-]/,punctuation:/\.\.?|[,;():]/,char:/'.'/,variable:/\b[a-z](?:\w)*\b/i}},57874:()=>{!function(e){var t="\\b(?:BASH|BASHOPTS|BASH_ALIASES|BASH_ARGC|BASH_ARGV|BASH_CMDS|BASH_COMPLETION_COMPAT_DIR|BASH_LINENO|BASH_REMATCH|BASH_SOURCE|BASH_VERSINFO|BASH_VERSION|COLORTERM|COLUMNS|COMP_WORDBREAKS|DBUS_SESSION_BUS_ADDRESS|DEFAULTS_PATH|DESKTOP_SESSION|DIRSTACK|DISPLAY|EUID|GDMSESSION|GDM_LANG|GNOME_KEYRING_CONTROL|GNOME_KEYRING_PID|GPG_AGENT_INFO|GROUPS|HISTCONTROL|HISTFILE|HISTFILESIZE|HISTSIZE|HOME|HOSTNAME|HOSTTYPE|IFS|INSTANCE|JOB|LANG|LANGUAGE|LC_ADDRESS|LC_ALL|LC_IDENTIFICATION|LC_MEASUREMENT|LC_MONETARY|LC_NAME|LC_NUMERIC|LC_PAPER|LC_TELEPHONE|LC_TIME|LESSCLOSE|LESSOPEN|LINES|LOGNAME|LS_COLORS|MACHTYPE|MAILCHECK|MANDATORY_PATH|NO_AT_BRIDGE|OLDPWD|OPTERR|OPTIND|ORBIT_SOCKETDIR|OSTYPE|PAPERSIZE|PATH|PIPESTATUS|PPID|PS1|PS2|PS3|PS4|PWD|RANDOM|REPLY|SECONDS|SELINUX_INIT|SESSION|SESSIONTYPE|SESSION_MANAGER|SHELL|SHELLOPTS|SHLVL|SSH_AUTH_SOCK|TERM|UID|UPSTART_EVENTS|UPSTART_INSTANCE|UPSTART_JOB|UPSTART_SESSION|USER|WINDOWID|XAUTHORITY|XDG_CONFIG_DIRS|XDG_CURRENT_DESKTOP|XDG_DATA_DIRS|XDG_GREETER_DATA_DIR|XDG_MENU_PREFIX|XDG_RUNTIME_DIR|XDG_SEAT|XDG_SEAT_PATH|XDG_SESSION_DESKTOP|XDG_SESSION_ID|XDG_SESSION_PATH|XDG_SESSION_TYPE|XDG_VTNR|XMODIFIERS)\\b",n={pattern:/(^(["']?)\w+\2)[ 

Submitting

In case you have any questions, feel free to reach out to me.

-
+
\ No newline at end of file diff --git a/c/category/bonuses/index.html b/c/category/bonuses/index.html index 61fdfe0..a058beb 100644 --- a/c/category/bonuses/index.html +++ b/c/category/bonuses/index.html @@ -16,8 +16,8 @@ - - + +

Bonuses

Bonus assignments for Kontr Coins. diff --git a/c/category/practice-exams/index.html b/c/category/practice-exams/index.html index 9893acc..2da489e 100644 --- a/c/category/practice-exams/index.html +++ b/c/category/practice-exams/index.html @@ -16,8 +16,8 @@ - - + +

Practice Exams

Practice exams for training for the final exam. diff --git a/c/index.html b/c/index.html index f1f8a6e..1e2cf6c 100644 --- a/c/index.html +++ b/c/index.html @@ -14,10 +14,10 @@ - - + + -

+ \ No newline at end of file diff --git a/c/mr/index.html b/c/mr/index.html index 15eebe5..607ed60 100644 --- a/c/mr/index.html +++ b/c/mr/index.html @@ -14,8 +14,8 @@ - - + +

Submitting merge requests for review

@@ -87,6 +87,6 @@ For the sake of safety, do not continue without clean repository. Then with comm be main or trunk.

aisa$ git status
# Check that the repository is clean

# If you already know what your default branch is, you can skip the next command.
aisa$ git branch
# Find the default branch in the list; it should be one of `master`, `main`, or
# `trunk`, and you should not have more than one of those.
# If the list clears the terminal and you cannot see the shell prompt, you can
# press `q` to quit the pager.

aisa$ git checkout master
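
If you don't want to scroll through the branch list at all, the default branch can also be printed non-interactively. The commands below are only a sketch on top of the steps above: they assume your remote is called `origin` and that `origin/HEAD` was set when you cloned the repository.

aisa$ git symbolic-ref --short refs/remotes/origin/HEAD
# Assumes the remote is named `origin`; prints something like `origin/master`,
# and the part after the slash is your default branch.
# If git complains that `refs/remotes/origin/HEAD` is not a symbolic ref,
# set it first and re-run the previous command:
aisa$ git remote set-head origin --auto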

-

Adapted from: https://www.fi.muni.cz/~xlacko1/pb071/mr.html

+

Adapted from: https://www.fi.muni.cz/~xlacko1/pb071/mr.html

\ No newline at end of file diff --git a/c/pexam/cams/index.html b/c/pexam/cams/index.html index bf2b316..076e30b 100644 --- a/c/pexam/cams/index.html +++ b/c/pexam/cams/index.html @@ -16,8 +16,8 @@ - - + +

Watching Cams

diff --git a/c/pexam/garbage_collect/index.html b/c/pexam/garbage_collect/index.html index fed99c6..052f5fc 100644 --- a/c/pexam/garbage_collect/index.html +++ b/c/pexam/garbage_collect/index.html @@ -16,8 +16,8 @@ - - + +

Garbage Collection

diff --git a/contributions/index.html b/contributions/index.html index 8e7d1bf..c21e069 100644 --- a/contributions/index.html +++ b/contributions/index.html @@ -14,8 +14,8 @@ - - + +

Contributions

Many of my contributions to open-source projects.

centpkg

Description

A tool for working with CentOS dist-git.

Contribution

I have fixed a bug that prevented centpkg-sig from cloning the dist-git repos from SIGs.

Fedora Messaging

Description

A library for sending AMQP messages with JSON schema in Fedora infrastructure.

Contribution

I contributed a small fix for a packaging issue that had been introduced by a new feature.

flexmock

Description

Flexmock is a testing library for Python that makes it easy to create mocks, stubs, and fakes.

Contribution

I have converted flexmock's pytest interception to use pytest's hook system after pytest changed its internal design.

tmt

Description

The tmt tool provides a user-friendly way to work with tests. You can comfortably create new tests, safely and easily run tests across different environments, review test results, debug test code and enable tests in the CI using a consistent and concise config.

Contribution

A small contribution to the docs, related to the changes implemented on the Packit side.

Fedora Infrastructure Ansible

Description

Collection of Ansible playbooks that powers the Fedora Infrastructure.

Contribution

I have adjusted the groups in the Bodhi playbooks after Packit was granted the privileges to propose updates without restrictions.

Bodhi

Description

Bodhi is a web system that facilitates the process of publishing updates for a Fedora-based software distribution.

Contribution

I have adjusted the client so that it doesn't show secrets in the terminal when you log in to Bodhi via the browser.

Gluetool Modules Collection

Description

Modules for gluetool — a command line centric framework usable for glueing modules into a pipeline.

Contribution
  • I have proposed a possible implementation of git merging that was later extended.
  • I have tried to help out with the Copr module after Copr deprecated the older version of its API.

Pagure

Description

Pagure is a git-centered forge written in Python, using pygit2.

Contribution

I have added an API endpoint for reopening pull requests.

Copr

Description

RPM build system - upstream for Copr.

Contribution
  • Supporting external repositories for the custom SRPM build method.
  • Allowing admins of Copr repositories to build without having to ask for explicit builder permissions.

python-gitlab

Description

A python wrapper for the GitLab API.

Contribution

I have contributed support for merge_ref on merge requests, which was already present in the GitLab API but not yet supported by the wrapper.

PatternFly React

Description

A set of React components for the PatternFly project.

Contribution

When working on the Packit Dashboard, I spotted smaller bugs in this project and fixed them upstream to provide a better experience for our users.

Fira Code

Description

Free monospaced font with programming ligatures.

Contribution

I have set up a GitHub Action that builds the font on each push to the default branch, allowing users to install a bleeding-edge version of the font.

nixpkgs

Description

Nixpkgs is a collection of over 80,000 software packages that can be installed with the Nix package manager. It also implements NixOS, a purely-functional Linux distribution.

Contribution

When I was trying out nixpkgs, I tried to bump .NET Core to the latest version. My changes weren't accepted, as they required bumping multiple other packages that depended on .NET Core.

Darcula

Description

A theme for Visual Studio Code based on the Darcula theme from JetBrains IDEs.

Contribution

I have contributed support for diff files; however, the project doesn't seem to be active anymore, so the change hasn't been accepted as of now.

Packit

Description

An open source project aiming to ease the integration of your project with Fedora Linux, CentOS Stream and other distributions.

Contribution

Have a look at my pull requests.

Snitch

Description

Language-agnostic tool that collects TODOs in the source code and reports them as Issues.

Contribution
  • Environment variable support for self-hosted GitLab instances
  • GitLab support

Karel the Robot

Description

Karel the Robot is an educational programming language for beginners, created by Richard E. Pattis. This is an implementation of Karel the Robot for the C programming language.

This project is used for educational purposes at TUKE.

Contribution

I have contributed some refactoring tips to the author of the library.

diff --git a/cpp/category/exceptions-and-raii/index.html b/cpp/category/exceptions-and-raii/index.html index 0598cce..02df41d 100644 --- a/cpp/category/exceptions-and-raii/index.html +++ b/cpp/category/exceptions-and-raii/index.html @@ -16,8 +16,8 @@ - - + +

Exceptions and RAII

Materials related to the exceptions or RAII in C++. diff --git a/cpp/environment/index.html b/cpp/environment/index.html index 469c617..6b3e5c3 100644 --- a/cpp/environment/index.html +++ b/cpp/environment/index.html @@ -16,8 +16,8 @@ - - + +

Environment

Required tools per OS

diff --git a/cpp/exceptions-and-raii/placeholders/index.html b/cpp/exceptions-and-raii/placeholders/index.html index 5584789..306f5f2 100644 --- a/cpp/exceptions-and-raii/placeholders/index.html +++ b/cpp/exceptions-and-raii/placeholders/index.html @@ -16,8 +16,8 @@ - - + +

Placeholders

Here we will try to implement some placeholders that you can find in other diff --git a/cpp/index.html b/cpp/index.html index 1e047d2..07778bc 100644 --- a/cpp/index.html +++ b/cpp/index.html @@ -14,10 +14,10 @@ - - + + -

+ \ No newline at end of file diff --git a/files/algorithms/graphs/iterative-and-iterators.tar.bz2 b/files/algorithms/graphs/iterative-and-iterators.tar.bz2 index c144579..61a793e 100644 Binary files a/files/algorithms/graphs/iterative-and-iterators.tar.bz2 and b/files/algorithms/graphs/iterative-and-iterators.tar.bz2 differ diff --git a/files/algorithms/graphs/iterative-and-iterators.tar.gz b/files/algorithms/graphs/iterative-and-iterators.tar.gz index 4a292a4..b726844 100644 Binary files a/files/algorithms/graphs/iterative-and-iterators.tar.gz and b/files/algorithms/graphs/iterative-and-iterators.tar.gz differ diff --git a/files/algorithms/paths/bf-to-astar.tar.bz2 b/files/algorithms/paths/bf-to-astar.tar.bz2 index 9a64f9b..94cce24 100644 Binary files a/files/algorithms/paths/bf-to-astar.tar.bz2 and b/files/algorithms/paths/bf-to-astar.tar.bz2 differ diff --git a/files/algorithms/paths/bf-to-astar.tar.gz b/files/algorithms/paths/bf-to-astar.tar.gz index 316ee4e..712ef0b 100644 Binary files a/files/algorithms/paths/bf-to-astar.tar.gz and b/files/algorithms/paths/bf-to-astar.tar.gz differ diff --git a/files/algorithms/recursion/karel-1.tar.bz2 b/files/algorithms/recursion/karel-1.tar.bz2 index ab40ff7..9698653 100644 Binary files a/files/algorithms/recursion/karel-1.tar.bz2 and b/files/algorithms/recursion/karel-1.tar.bz2 differ diff --git a/files/algorithms/recursion/karel-1.tar.gz b/files/algorithms/recursion/karel-1.tar.gz index 16ac756..27b80e3 100644 Binary files a/files/algorithms/recursion/karel-1.tar.gz and b/files/algorithms/recursion/karel-1.tar.gz differ diff --git a/files/algorithms/recursion/pyramid-slide-down.tar.bz2 b/files/algorithms/recursion/pyramid-slide-down.tar.bz2 index 66332e1..b2375b1 100644 Binary files a/files/algorithms/recursion/pyramid-slide-down.tar.bz2 and b/files/algorithms/recursion/pyramid-slide-down.tar.bz2 differ diff --git a/files/algorithms/recursion/pyramid-slide-down.tar.gz b/files/algorithms/recursion/pyramid-slide-down.tar.gz index cd49eff..48c32c3 100644 Binary files a/files/algorithms/recursion/pyramid-slide-down.tar.gz and b/files/algorithms/recursion/pyramid-slide-down.tar.gz differ diff --git a/files/algorithms/time-complexity/extend.tar.bz2 b/files/algorithms/time-complexity/extend.tar.bz2 index 113f90c..7f5a795 100644 Binary files a/files/algorithms/time-complexity/extend.tar.bz2 and b/files/algorithms/time-complexity/extend.tar.bz2 differ diff --git a/files/algorithms/time-complexity/extend.tar.gz b/files/algorithms/time-complexity/extend.tar.gz index 88b9405..4b2582d 100644 Binary files a/files/algorithms/time-complexity/extend.tar.gz and b/files/algorithms/time-complexity/extend.tar.gz differ diff --git a/files/c/bonuses/03.tar.bz2 b/files/c/bonuses/03.tar.bz2 index 0504060..b9dfc5d 100644 Binary files a/files/c/bonuses/03.tar.bz2 and b/files/c/bonuses/03.tar.bz2 differ diff --git a/files/c/bonuses/03.tar.gz b/files/c/bonuses/03.tar.gz index 9db3c9a..ec870f2 100644 Binary files a/files/c/bonuses/03.tar.gz and b/files/c/bonuses/03.tar.gz differ diff --git a/files/c/bonuses/04.tar.bz2 b/files/c/bonuses/04.tar.bz2 index 4b01282..6cc5a4a 100644 Binary files a/files/c/bonuses/04.tar.bz2 and b/files/c/bonuses/04.tar.bz2 differ diff --git a/files/c/bonuses/04.tar.gz b/files/c/bonuses/04.tar.gz index 7a8af0c..39ed9de 100644 Binary files a/files/c/bonuses/04.tar.gz and b/files/c/bonuses/04.tar.gz differ diff --git a/files/c/bonuses/05-06.tar.bz2 b/files/c/bonuses/05-06.tar.bz2 index 93aca9f..148b634 100644 Binary files 
a/files/c/bonuses/05-06.tar.bz2 and b/files/c/bonuses/05-06.tar.bz2 differ diff --git a/files/c/bonuses/05-06.tar.gz b/files/c/bonuses/05-06.tar.gz index 223d420..9242034 100644 Binary files a/files/c/bonuses/05-06.tar.gz and b/files/c/bonuses/05-06.tar.gz differ diff --git a/files/c/bonuses/08.tar.bz2 b/files/c/bonuses/08.tar.bz2 index 396f2a5..7b17864 100644 Binary files a/files/c/bonuses/08.tar.bz2 and b/files/c/bonuses/08.tar.bz2 differ diff --git a/files/c/bonuses/08.tar.gz b/files/c/bonuses/08.tar.gz index 59caeab..4d2240c 100644 Binary files a/files/c/bonuses/08.tar.gz and b/files/c/bonuses/08.tar.gz differ diff --git a/files/c/bonuses/10.tar.bz2 b/files/c/bonuses/10.tar.bz2 index 8a28a22..d5ad5f7 100644 Binary files a/files/c/bonuses/10.tar.bz2 and b/files/c/bonuses/10.tar.bz2 differ diff --git a/files/c/bonuses/10.tar.gz b/files/c/bonuses/10.tar.gz index ee9dc94..1d90fe4 100644 Binary files a/files/c/bonuses/10.tar.gz and b/files/c/bonuses/10.tar.gz differ diff --git a/index.html b/index.html index 98c22aa..fc2177a 100644 --- a/index.html +++ b/index.html @@ -14,8 +14,8 @@ - - + +

mf

blog and additional materials for courses at φ

About Me

I'm working at Red Hat on the Packit team and studying at FI MUNI, while also tutoring some courses there.

Content

On this page you can find my blog and the unofficial materials I have written while teaching multiple courses at the FI.

Mastodon

Feel free to contact me on either of the following Mastodon accounts: Fosstodon or Hachyderm.io.

diff --git a/search/index.html b/search/index.html index 897ab32..84b3d93 100644 --- a/search/index.html +++ b/search/index.html @@ -14,8 +14,8 @@ - - + + diff --git a/talks/index.html b/talks/index.html index dc914f5..5ecd68f 100644 --- a/talks/index.html +++ b/talks/index.html @@ -14,8 +14,8 @@ - - + +

Talks

Featured talks I presented on various events.

Shift Left Testing with Packit and Testing Farm

In today's fast-paced software development landscape, ensuring the quality and reliability of upstream contributions is crucial. The traditional approach of testing at the end of the development cycle is no longer sufficient. To address this challenge, we present "Shift Left Your Testing with Packit and Testing Farm", a talk that introduces two powerful tools designed to simplify and enhance the testing process for upstream contributions.

Packit and Testing Farm provide a dead simple way to build and test your upstream contributions against both public and internal Red Hat testing infrastructure. In this talk, we will explore the capabilities of both tools and demonstrate how they can be seamlessly integrated into your development workflow.

In addition to the current capabilities, we will share our plans for Packit and Testing Farm.

  • QEcamp23
  • virtual
  • 11/2023

Packit: RPM integration, all in one

Do you want to automate how you build and test your RPM packages? Do you maintain any package in Fedora and want to automate the releases? Or are you just interested in CI/CD on GitHub or GitLab, Fedora and integration of upstream projects with RPM-based Linux distributions? In this session, we are going to deep-dive into features of Packit that can help you do your day-to-day job.
  • DevConf.cz
  • Brno, Czechia
  • 6/2023

Also presented on:

  • DevConf.cz Mini in Brno, Czechia (3/2023)

Credits to Paweł Kosiec for implementing his own React components for talks.