[WIP] Handle sidebar groups config update #5191

Draft · wants to merge 6 commits into develop
@@ -34,17 +34,17 @@ import {
   useRecoilStateLoadable,
   useRecoilValueLoadable,
 } from "recoil";
-import { collapseFields, getCurrentEnvironment } from "../utils";
-import * as atoms from "./atoms";
-import { getBrowserStorageEffectForKey } from "./customEffects";
+import { collapseFields, getCurrentEnvironment } from "../../utils";
+import * as atoms from "../atoms";
+import { getBrowserStorageEffectForKey } from "../customEffects";
 import {
   active3dSlices,
   active3dSlicesToSampleMap,
   activeModalSidebarSample,
   pinned3DSampleSlice,
-} from "./groups";
-import { isLargeVideo } from "./options";
-import { cumulativeValues, values } from "./pathData";
+} from "../groups";
+import { isLargeVideo } from "../options";
+import { cumulativeValues, values } from "../pathData";
 import {
   buildSchema,
   field,
@@ -53,23 +53,24 @@ import {
   filterPaths,
   isOfDocumentFieldList,
   pathIsShown,
-} from "./schema";
-import { isFieldVisibilityActive } from "./schemaSettings.atoms";
+} from "../schema";
+import { isFieldVisibilityActive } from "../schemaSettings.atoms";
 import {
   datasetName,
   disableFrameFiltering,
   isVideoDataset,
   stateSubscription,
-} from "./selectors";
-import { State } from "./types";
+} from "../selectors";
+import { State } from "../types";
 import {
   fieldsMatcher,
   groupFilter,
   labelsMatcher,
   primitivesMatcher,
   unsupportedMatcher,
-} from "./utils";
-import * as viewAtoms from "./view";
+} from "../utils";
+import * as viewAtoms from "../view";
+import { mergeGroups } from "./sidebar-utils";
 
 export enum EntryKind {
   EMPTY = "EMPTY",
@@ -210,6 +211,10 @@ export const resolveGroups = (
     ? DEFAULT_VIDEO_GROUPS
     : DEFAULT_IMAGE_GROUPS;
 
+  if (currentGroups.length && configGroups.length) {
+    groups = mergeGroups(groups, configGroups);
+  }
+
   const expanded = configGroups.reduce((map, { name, expanded }) => {
     map[name] = expanded;
     return map;
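
For orientation, the new branch only runs when both the current groups and the dataset's app config define sidebar groups, and it merges the config groups into the resolved groups before expansion state is computed. A rough sketch of the intended mergeGroups behavior follows; the group names and paths are invented for illustration and are not taken from the PR:

// illustrative only: groups the app already knows keep their order, and
// groups/paths that only exist in the config are spliced in next to their
// config neighbors
const current = [{ name: "labels", paths: ["ground_truth"] }];
const config = [
  { name: "metadata", paths: [] },
  { name: "labels", paths: ["ground_truth", "predictions"] },
];

mergeGroups(current, config);
// -> [
//      { name: "metadata", paths: [] },
//      { name: "labels", paths: ["ground_truth", "predictions"] },
//    ]
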
37 changes: 37 additions & 0 deletions app/packages/state/src/recoil/sidebar/sidebar-utils.test.ts
@@ -0,0 +1,37 @@
import { describe, expect, it, vi } from "vitest";

vi.mock("recoil");
vi.mock("recoil-relay");

import { merge, mergeGroups } from "./sidebar-utils";

describe("test sidebar groups resolution", () => {
  it("test list merge", () => {
    expect(merge([], ["one", "two"])).toStrictEqual(["one", "two"]);
  });
Comment on lines +8 to +11
🛠️ Refactor suggestion

Enhance test coverage with additional test cases

The current test only covers merging an empty array with a non-empty array. Consider adding test cases for:

  • Two non-empty arrays
  • Arrays with duplicate elements
  • Arrays with different data types (if applicable)
  • Edge cases like null/undefined inputs
 describe("test sidebar groups resolution", () => {
   it("test list merge", () => {
     expect(merge([], ["one", "two"])).toStrictEqual(["one", "two"]);
+    expect(merge(["one"], ["two"])).toStrictEqual(["one", "two"]);
+    expect(merge(["one"], ["one", "two"])).toStrictEqual(["one", "two"]);
+    expect(merge(null, ["one"])).toStrictEqual(["one"]);
   });

Committable suggestion skipped: line range outside the PR's diff.


it("merges current and config groups", () => {
expect(
mergeGroups(
[
{ name: "one", paths: ["one.one", "one.three"] },
{ name: "three", paths: [] },
],

[
{ name: "zero", paths: [] },
{
name: "one",
paths: ["one.zero", "one.one", "one.two"],
},
{ name: "two", paths: [] },
]
)
).toStrictEqual([
{ name: "zero", paths: [] },
{ name: "one", paths: ["one.zero", "one.one", "one.two", "one.three"] },
{ name: "two", paths: [] },
{ name: "three", paths: [] },
]);
});
});
Comment on lines +13 to +37
🛠️ Refactor suggestion

Add test cases for edge cases and error handling

While the current test case covers the happy path, consider adding tests for:

  1. Groups with identical paths
  2. Empty or invalid group names
  3. Malformed group objects
  4. Error handling scenarios

Also, consider breaking down this large test into smaller, more focused test cases for better maintainability.

   it("merges current and config groups", () => {
     expect(
       mergeGroups(
         [
           { name: "one", paths: ["one.one", "one.three"] },
           { name: "three", paths: [] },
         ],
         [
           { name: "zero", paths: [] },
           {
             name: "one",
             paths: ["one.zero", "one.one", "one.two"],
           },
           { name: "two", paths: [] },
         ]
       )
     ).toStrictEqual([/*...*/]);
   });
+
+  it("handles groups with identical paths", () => {
+    expect(
+      mergeGroups(
+        [{ name: "one", paths: ["path1"] }],
+        [{ name: "one", paths: ["path1"] }]
+      )
+    ).toStrictEqual([
+      { name: "one", paths: ["path1"] }
+    ]);
+  });
+
+  it("validates group names", () => {
+    expect(() => 
+      mergeGroups(
+        [{ name: "", paths: [] }],
+        []
+      )
+    ).toThrow();
+  });

Committable suggestion skipped: line range outside the PR's diff.

89 changes: 89 additions & 0 deletions app/packages/state/src/recoil/sidebar/sidebar-utils.ts
@@ -0,0 +1,89 @@
import type { State } from "../types";

const hasNeighbor = (sink: string[], source: string[], key: string) => {
  const index = source.indexOf(key);
  const before = source[index - 1];
  const after = source[index + 1];

  return sink.includes(before) || sink.includes(after);
};

const insertFromNeighbor = (sink: string[], source: string[], key: string) => {
  if (sink.includes(key)) {
    return;
  }

  const sourceIndex = source.indexOf(key);
  const before = source[sourceIndex - 1];
  const beforeIndex = sink.indexOf(before);

  if (beforeIndex >= 0) {
    sink.splice(beforeIndex + 1, 0, key);
    return;
  }

  const after = source[sourceIndex + 1];
  const afterIndex = sink.indexOf(after);

  if (afterIndex >= 0) {
    sink.splice(afterIndex, 0, key);
    return;
  }

  sink.push(key);
  return;
};
Comment on lines +11 to +35
🛠️ Refactor suggestion

Document the insertion strategy and consider immutability

The function has complex insertion logic that would benefit from documentation. Also, mutating the input array directly could lead to unexpected side effects.

Consider:

  1. Adding JSDoc comments explaining the insertion strategy
  2. Making the function pure by returning a new array instead of mutating the input
+/**
+ * Inserts a key into the sink array based on its neighbors' positions in the source array.
+ * The insertion strategy prioritizes:
+ * 1. After an existing "before" neighbor
+ * 2. Before an existing "after" neighbor
+ * 3. Appends to the end if no neighbors exist
+ * @returns A new array with the inserted key
+ */
-const insertFromNeighbor = (sink: string[], source: string[], key: string) => {
+const insertFromNeighbor = (sink: string[], source: string[], key: string): string[] => {
+  const result = [...sink];
   if (sink.includes(key)) {
-    return;
+    return result;
   }
   // ... rest of the logic, operating on result instead of sink
-  return;
+  return result;
 };

Committable suggestion skipped: line range outside the PR's diff.
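
For reference, a fully written-out version of the pure variant sketched above could look like the following. It mirrors the existing insertFromNeighbor logic but operates on a copy; the name insertFromNeighborPure is illustrative and not part of the PR:

const insertFromNeighborPure = (
  sink: string[],
  source: string[],
  key: string
): string[] => {
  // work on a copy so the caller's array is never mutated
  const result = [...sink];
  if (result.includes(key)) {
    return result;
  }

  const sourceIndex = source.indexOf(key);

  // 1. insert after the key's "before" neighbor if it is already present
  const beforeIndex = result.indexOf(source[sourceIndex - 1]);
  if (beforeIndex >= 0) {
    result.splice(beforeIndex + 1, 0, key);
    return result;
  }

  // 2. otherwise insert before the key's "after" neighbor
  const afterIndex = result.indexOf(source[sourceIndex + 1]);
  if (afterIndex >= 0) {
    result.splice(afterIndex, 0, key);
    return result;
  }

  // 3. no neighbor is present yet: append to the end
  result.push(key);
  return result;
};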


export const merge = (sink: string[], source: string[]) => {
  const missing = new Set(source.filter((key) => !sink.includes(key)));

  while (missing.size) {
    const force = [...missing].every(
      (name) => !hasNeighbor(sink, source, name)
    );
    for (const name of missing) {
      if (!force && !hasNeighbor(sink, source, name)) {
        continue;
      }

      insertFromNeighbor(sink, source, name);

      missing.delete(name);
    }
  }

  return sink;
};
Comment on lines +37 to +56
⚠️ Potential issue

Add safety guard and improve type annotations

The while loop could potentially run indefinitely if its termination conditions aren't met. Also, the function would benefit from an explicit return type annotation.

-export const merge = (sink: string[], source: string[]) => {
+export const merge = (sink: string[], source: string[]): string[] => {
   const missing = new Set(source.filter((key) => !sink.includes(key)));
+  let iterations = 0;
+  const MAX_ITERATIONS = 1000; // Adjust based on expected maximum size
 
-  while (missing.size) {
+  while (missing.size && iterations++ < MAX_ITERATIONS) {
     const force = [...missing].every(
       (name) => !hasNeighbor(sink, source, name)
     );
     // ... rest of the logic
   }
 
+  if (iterations >= MAX_ITERATIONS) {
+    console.warn('Maximum iterations reached in merge function');
+  }
 
   return sink;
 };
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
export const merge = (sink: string[], source: string[]) => {
  const missing = new Set(source.filter((key) => !sink.includes(key)));
  while (missing.size) {
    const force = [...missing].every(
      (name) => !hasNeighbor(sink, source, name)
    );
    for (const name of missing) {
      if (!force && !hasNeighbor(sink, source, name)) {
        continue;
      }
      insertFromNeighbor(sink, source, name);
      missing.delete(name);
    }
  }
  return sink;
};

export const merge = (sink: string[], source: string[]): string[] => {
  const missing = new Set(source.filter((key) => !sink.includes(key)));
  let iterations = 0;
  const MAX_ITERATIONS = 1000; // Adjust based on expected maximum size
  while (missing.size && iterations++ < MAX_ITERATIONS) {
    const force = [...missing].every(
      (name) => !hasNeighbor(sink, source, name)
    );
    for (const name of missing) {
      if (!force && !hasNeighbor(sink, source, name)) {
        continue;
      }
      insertFromNeighbor(sink, source, name);
      missing.delete(name);
    }
  }
  if (iterations >= MAX_ITERATIONS) {
    console.warn('Maximum iterations reached in merge function');
  }
  return sink;
};


export const mergeGroups = (
  sink: State.SidebarGroup[],
  source: State.SidebarGroup[]
) => {
  // make copies, assume readonly
  const mapping = Object.fromEntries(sink.map((g) => [g.name, { ...g }]));
  const configMapping = Object.fromEntries(
    source.map((g) => [g.name, { ...g }])
  );

  const sinkKeys = sink.map(({ name }) => name);
  const sourceKeys = source.map(({ name }) => name);

  merge(sinkKeys, sourceKeys);

  for (const key of sinkKeys) {
    mapping[key] = mapping[key] ?? configMapping[key];
  }
  const resolved = sinkKeys.map((g) => mapping[g] ?? configMapping[g]);
  for (const { name } of resolved) {
    const i = sourceKeys.indexOf(name);
    if (i < 0) {
      continue;
    }

    // make copies, assume readonly
    mapping[name].paths = [...(mapping[name].paths ?? [])];
    merge(mapping[name].paths, [...source[i].paths]);
  }

  return resolved;
};
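
To make the neighbor-based insertion concrete, here is a small illustrative call of merge; the sidebar key names are invented for the example, and sink is mutated in place as in the implementation above:

const sink = ["tags", "labels"];
const source = ["metadata", "tags", "primitives"];

merge(sink, source);
// sink is now ["metadata", "tags", "primitives", "labels"]:
// "metadata" is spliced in before its source neighbor "tags",
// "primitives" is spliced in after "tags", and "labels" keeps its position
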
5 changes: 4 additions & 1 deletion fiftyone/core/singletons.py
@@ -5,6 +5,7 @@
 | `voxel51.com <https://voxel51.com/>`_
 |
 """
+
 from collections import defaultdict
 import weakref
 
@@ -37,7 +38,9 @@ def __call__(cls, name=None, _create=True, *args, **kwargs):
             name = instance.name  # `__init__` may have changed `name`
         else:
             try:
-                instance._update_last_loaded_at()
+                instance._update_last_loaded_at(
+                    force=kwargs.get("_force_load", False)
+                )
             except ValueError:
                 instance._deleted = True
                 return cls.__call__(
4 changes: 3 additions & 1 deletion fiftyone/core/state.py
@@ -97,7 +97,9 @@ def serialize(self, reflective=True):
 
         if self.dataset is not None:
             d["dataset"] = self.dataset.name
-            collection = self.dataset
+            collection = fo.Dataset(
+                self.dataset.name, _create=False, _force_load=True
+            )
             if self.view is not None:
                 collection = self.view
 
6 changes: 1 addition & 5 deletions fiftyone/server/query.py
@@ -585,11 +585,7 @@ def run():
         if not fod.dataset_exists(dataset_name):
             return None
 
-        dataset = fod.load_dataset(dataset_name)
-
-        if update_last_loaded_at:
-            dataset._update_last_loaded_at(force=True)
-
+        dataset = fo.Dataset(dataset_name, _create=False, _force_load=True)
         dataset.reload()
         view_name = None
         try: