
Commit

Merge branch 'main' into feat/graph-primitives
grtlr authored Nov 25, 2024
2 parents 27f5e6a + 1c9b942 commit a31f84d
Showing 4 changed files with 20 additions and 96 deletions.
21 changes: 9 additions & 12 deletions CONTRIBUTING.md
@@ -18,7 +18,11 @@ This is written for anyone who wants to contribute to the Rerun repository.
You can also look at our [`good first issue` tag](https://github.com/rerun-io/rerun/labels/good%20first%20issue).

## Pull requests
We use [Trunk Based Development](https://trunkbaseddevelopment.com/), which means we encourage small, short-lived branches. Open draft PR:s to get some early feedback on your work.
We use [Trunk Based Development](https://trunkbaseddevelopment.com/), which means we encourage small, short-lived branches.

Open draft PR:s to get some early feedback on your work until you feel it is ready for a proper review.
Do not make PR:s from your own `main` branch, as that makes it difficult for reviewers to add their own fixes.
Add any improvements to the branch as new commits instead of rebasing to make it easier for reviewers to follow the progress (add images if possible!).

All PR:s are merged with [`Squash and Merge`](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/incorporating-changes-from-a-pull-request/about-pull-request-merges#squash-and-merge-your-commits), meaning they all get squashed to just one commit on the `main` branch. This means you don't need to keep a clean commit history on your feature branches. In fact, it is preferable to add new commits to a branch rather than rebasing or squashing. For one, it makes it easier to track progress on a branch, but rebasing and force-pushing also discourages collaboration on a branch.

@@ -34,12 +38,7 @@ Members of the `rerun-io` organization and collaborators in the `rerun-io/rerun`

## Contributing to CI

Every CI job would in its ideal state consist of only two steps:

1. Install tools and libraries[^1]
2. Run a script

In which the script is written and tested locally before being wrapped in a CI workflow file. This does not mean that scripts are merely _reproducible_ locally (though that is also true), it means that they must be written with a _local-first mindset_, as if they are not supposed to run on CI at all.
Every CI job would in its ideal state consist of only a single `pixi` (or similar) script invocation that works locally as-is.

This approach has a number of benefits:
- Instead of Bash embedded in YAML, scripts may be written in an Actual Programming Language™
@@ -48,8 +47,6 @@ This approach has a number of benefits:

Additionally, always output any artifacts produced by CI to GCS instead of the GHA artifact storage. This can be a serious lifesaver when something breaks, as it allows anyone to download the output of a script and continue from where it failed, instead of being forced to start over from scratch.

[^1]: For some larger jobs, we prefer to use a [docker image](https://hub.docker.com/r/rerunio/ci_docker) to make managing dependencies simpler, and to keep everything locked to a specific version as much as possible. In this case, it's still good practice to install dependencies, because it ensures the job continues to work even if the docker image is out of date.

Here are some guidelines to follow when writing such scripts:

Local-first means easy for contributors to run.
@@ -114,7 +111,6 @@ cargo run -p rerun -- --help
## Tools

We use [`pixi`](https://prefix.dev/) for managing dev-tool versioning, downloads, and task running. To see available tasks, use `pixi task list`.
TODO(andreas): This doesn't list tasks from all Pixi environments. There's no way to this so far, see also [here](https://discord.com/channels/1082332781146800168/1227563080934756475/1227563080934756475).

We use [cargo deny](https://github.com/EmbarkStudios/cargo-deny) to check our dependency tree for copy-left licenses, duplicate dependencies and [rustsec advisories](https://rustsec.org/advisories). You can configure it in `deny.toml`. Usage: `cargo deny check`
Configure your editor to run `cargo fmt` on save. Also configure it to strip trailing whitespace, and to end each file with a newline. Settings for VSCode can be found in the `.vscode` folder and should be applied automatically. If you are using another editor, consider adding good settings to this repository!
@@ -123,7 +119,7 @@ Depending on the changes you made run `cargo test --all-targets --all-features`,

### Linting
Prior to pushing changes to a PR, at a minimum, you should always run `pixi run fast-lint`. This is designed to run
in a few seconds and should catch the more trivial issues to avoid wasting CI time.
in a few seconds for repeated runs and should catch the more trivial issues to avoid wasting CI time.

### Hooks
We recommend adding the Rerun pre-push hook to your local checkout, which among other things will run
@@ -145,4 +141,5 @@ You can use [bacon](https://github.com/Canop/bacon) to automatically check your code on each save.
You can set up [`sccache`](https://github.com/mozilla/sccache) to speed up re-compilation (e.g. when switching branches). You can control the size of the cache with `export SCCACHE_CACHE_SIZE="256G"`.

### Other
You can view higher log levels with `export RUST_LOG=debug` or `export RUST_LOG=trace`.
You can view higher log levels with `export RUST_LOG=trace`.
Debug logging is automatically enabled for the viewer as long as you're running inside the `rerun` checkout.
20 changes: 10 additions & 10 deletions Cargo.lock
@@ -3117,7 +3117,7 @@ dependencies = [
"http 1.1.0",
"hyper 1.5.0",
"hyper-util",
"rustls 0.23.16",
"rustls 0.23.18",
"rustls-native-certs",
"rustls-pki-types",
"tokio",
@@ -3572,7 +3572,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4979f22fdb869068da03c9f7528f8297c6fd2606bc3a4affe42e6a823fdb8da4"
dependencies = [
"cfg-if",
"windows-targets 0.52.6",
"windows-targets 0.48.5",
]

[[package]]
@@ -5070,7 +5070,7 @@ dependencies = [
"quinn-proto",
"quinn-udp",
"rustc-hash 2.0.0",
"rustls 0.23.16",
"rustls 0.23.18",
"socket2",
"thiserror",
"tokio",
@@ -5087,7 +5087,7 @@ dependencies = [
"rand",
"ring",
"rustc-hash 2.0.0",
"rustls 0.23.16",
"rustls 0.23.18",
"slab",
"thiserror",
"tinyvec",
@@ -6818,7 +6818,7 @@ dependencies = [
"percent-encoding",
"pin-project-lite",
"quinn",
"rustls 0.23.16",
"rustls 0.23.18",
"rustls-native-certs",
"rustls-pemfile 2.2.0",
"rustls-pki-types",
@@ -7366,9 +7366,9 @@ dependencies = [

[[package]]
name = "rustls"
version = "0.23.16"
version = "0.23.18"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "eee87ff5d9b36712a58574e12e9f0ea80f915a5b0ac518d322b24a465617925e"
checksum = "9c9cc1d47e243d655ace55ed38201c19ae02c148ae56412ab8750e8f0166ab7f"
dependencies = [
"log",
"once_cell",
@@ -8379,7 +8379,7 @@ version = "0.26.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0c7bc40d0e5a97695bb96e27995cd3a08538541b0a846f65bba7a359f36700d4"
dependencies = [
"rustls 0.23.16",
"rustls 0.23.18",
"rustls-pki-types",
"tokio",
]
@@ -8615,7 +8615,7 @@ dependencies = [
"httparse",
"log",
"rand",
"rustls 0.23.16",
"rustls 0.23.18",
"rustls-pki-types",
"sha1",
"thiserror",
@@ -8732,7 +8732,7 @@ dependencies = [
"flate2",
"log",
"once_cell",
"rustls 0.23.16",
"rustls 0.23.18",
"rustls-pki-types",
"serde",
"serde_json",
73 changes: 0 additions & 73 deletions crates/store/re_log_types/src/hash.rs
@@ -1,7 +1,3 @@
// ----------------------------------------------------------------------------

use std::hash::BuildHasher;

/// 64-bit hash.
///
/// 10^-12 collision risk with 6k values.
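As an aside (an editorial note, not part of this diff): the quoted collision risk is consistent with the usual birthday-bound estimate for n = 6000 values hashed into 64 bits:

```latex
P_{\text{collision}} \approx \frac{n(n-1)}{2 \cdot 2^{64}}
                     \approx \frac{6000^{2}}{2 \cdot 1.8 \times 10^{19}}
                     \approx 9.8 \times 10^{-13}
                     \approx 10^{-12}
```
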
@@ -61,77 +57,8 @@ impl std::fmt::Debug for Hash64 {

// ----------------------------------------------------------------------------

/// 128-bit hash. Negligible risk for collision.
#[derive(Copy, Clone, Eq)]
pub struct Hash128([u64; 2]);

impl Hash128 {
    pub const ZERO: Self = Self([0; 2]);

    pub fn hash(value: impl std::hash::Hash + Copy) -> Self {
        Self(double_hash(value))
    }

    #[inline]
    pub fn hash64(&self) -> u64 {
        self.0[0]
    }

    #[inline]
    pub fn first64(&self) -> u64 {
        self.0[0]
    }

    #[inline]
    pub fn second64(&self) -> u64 {
        self.0[1]
    }
}

impl std::hash::Hash for Hash128 {
    #[inline]
    fn hash<H: std::hash::Hasher>(&self, state: &mut H) {
        state.write_u64(self.0[0]);
    }
}

impl std::cmp::PartialEq for Hash128 {
    #[inline]
    fn eq(&self, other: &Self) -> bool {
        self.0 == other.0
    }
}

impl nohash_hasher::IsEnabled for Hash128 {}

impl std::fmt::Debug for Hash128 {
    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
        f.write_str(&format!("Hash128({:016X}{:016X})", self.0[0], self.0[1]))
    }
}

// ----------------------------------------------------------------------------

pub const HASH_RANDOM_STATE: ahash::RandomState = ahash::RandomState::with_seeds(0, 1, 2, 3);

#[inline]
fn double_hash(value: impl std::hash::Hash + Copy) -> [u64; 2] {
    [hash_with_seed(value, 123), hash_with_seed(value, 456)]
}

/// Hash the given value.
#[inline]
fn hash_with_seed(value: impl std::hash::Hash, seed: u128) -> u64 {
    use std::hash::Hash as _;
    use std::hash::Hasher as _;

    // Don't use ahash::AHasher::default() since it uses a random number for seeding the hasher on every application start.
    let mut hasher = HASH_RANDOM_STATE.build_hasher();
    seed.hash(&mut hasher);
    value.hash(&mut hasher);
    hasher.finish()
}

/// Hash the given value.
#[inline]
fn hash(value: impl std::hash::Hash) -> u64 {
2 changes: 1 addition & 1 deletion crates/utils/re_string_interner/src/lib.rs
@@ -78,7 +78,7 @@ impl std::cmp::PartialEq for InternedString {
impl std::hash::Hash for InternedString {
    #[inline]
    fn hash<H: std::hash::Hasher>(&self, state: &mut H) {
        self.hash.hash(state);
        state.write_u64(self.hash);
    }
}

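For context, the `InternedString` change follows the same single-`write_u64` pattern as the removed `Hash128` code above: the precomputed 64-bit hash is handed to the hasher in exactly one `write_u64` call, which is the contract that types opting into `nohash_hasher` via `IsEnabled` rely on. Below is a minimal, self-contained sketch of that pattern; `PrecomputedId` is a hypothetical type for illustration (not part of this commit), and the example assumes the `nohash_hasher` crate is available:

```rust
use std::collections::HashMap;
use std::hash::{Hash, Hasher};

/// A key that carries a precomputed 64-bit hash (hypothetical example type).
#[derive(Copy, Clone, PartialEq, Eq)]
struct PrecomputedId {
    hash: u64,
}

impl Hash for PrecomputedId {
    #[inline]
    fn hash<H: Hasher>(&self, state: &mut H) {
        // Hand the stored hash to the hasher in a single `write_u64` call.
        state.write_u64(self.hash);
    }
}

// The marker trait promises that `hash` above performs exactly one `write_*` call,
// which is what lets the pass-through hasher below work.
impl nohash_hasher::IsEnabled for PrecomputedId {}

fn main() {
    // A map whose "hasher" simply forwards the u64 it is given.
    let mut map: HashMap<PrecomputedId, &str, nohash_hasher::BuildNoHashHasher<PrecomputedId>> =
        HashMap::default();
    map.insert(PrecomputedId { hash: 42 }, "forty-two");
    assert_eq!(map.get(&PrecomputedId { hash: 42 }), Some(&"forty-two"));
}
```

With an ordinary SipHash-based map the old and new `hash` bodies behave the same, since `u64::hash` also boils down to one `write_u64`; calling it directly simply states the one-write invariant explicitly.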
