
chore: calculate only missing contour levels #3224

Closed · wants to merge 5 commits

Conversation

@jobo322 (Member) commented Sep 2, 2024

No description provided.

cloudflare-workers-and-pages bot commented Sep 2, 2024

Deploying nmrium with Cloudflare Pages

Latest commit: f2d29f9
Status: ✅ Deploy successful!
Preview URL: https://b5f095dc.nmrium.pages.dev
Branch Preview URL: https://cache-contours-2d.nmrium.pages.dev

@targos targos self-assigned this Sep 4, 2024
@targos (Member) left a comment

I will try to point out the problems when I have more time, but I suggest you spend some time learning React on your side, because the things you did with the hooks are very wrong.

@jobo322 (Member, Author) commented Sep 4, 2024

I will try to point out the problems when I have more time, but I suggest you spend some time learning React on your side, because the things you did with the hooks are very wrong.

That was very much expected, because I have no idea about React, but I wanted to show Luc the behavior of the contour plot cache while we wait for Hamed. The contour generation time is reduced a lot when all the levels are cached.

@hamed-musallam (Member)

@targos

Are you planning to refactor the code? If you're busy with something else, I can take care of the refactoring and fix his code.

@targos (Member) commented Sep 6, 2024

I'm not planning to refactor it, but I would like to review it when it's ready.

@targos targos removed their assignment Sep 6, 2024
@hamed-musallam (Member) commented Sep 6, 2024

@jobo322

I created a specialized context to handle contour calculations per nucleus:

    const contours: Contours = {};
    for (const spectrum of spectra) {
      const {
        id,
        info: { isFt },
      } = spectrum;
      if (!isFt) {
        continue;
      }
      const positive = getContours(spectrum);
      const negative = getContours(spectrum, true);
      contours[id] = { positive, negative };
    }
    return contours;
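
Conceptually, the context computes the contours once and memoizes them, so every consumer shares the same cached result. Here is a minimal sketch of that idea, assuming placeholder names (ContoursProvider, Spectrum2D) and a simplified getContours signature rather than the real nmrium code:

    import { createContext, useContext, useMemo } from 'react';
    import type { ReactNode } from 'react';

    // Placeholders: the real nmrium types and helpers differ.
    interface Spectrum2D {
      id: string;
      info: { isFt: boolean };
    }
    declare function getContours(spectrum: Spectrum2D, negative?: boolean): unknown;

    type Contours = Record<string, { positive: unknown; negative: unknown }>;

    const ContoursContext = createContext<Contours>({});

    export function ContoursProvider(props: { spectra: Spectrum2D[]; children: ReactNode }) {
      const { spectra, children } = props;

      // Recompute only when the spectra list changes.
      const contours = useMemo(() => {
        const result: Contours = {};
        for (const spectrum of spectra) {
          if (!spectrum.info.isFt) continue;
          result[spectrum.id] = {
            positive: getContours(spectrum),
            negative: getContours(spectrum, true),
          };
        }
        return result;
      }, [spectra]);

      return (
        <ContoursContext.Provider value={contours}>{children}</ContoursContext.Provider>
      );
    }

    export function useContours() {
      return useContext(ContoursContext);
    }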

This requires additional work:

  1. Move the contour options from the display object within the spectrum object to the view, to avoid recalculating the contours when those options change (this can be done later, once the reset works as expected).
  2. Calculate all the contour levels once. Please review the drawContours function and make the necessary changes.

    return drawContours(
      spectrum,
      {
        contourLevels: [0, 100],
        numberOfLayers: 10,
      },
      negative,
    );

We have a custom hook, useContours, which returns an object where each key is a spectrum ID, and the value is an object with the following structure: { positive: { contours: [], timeout: boolean }, negative: { contours: [], timeout: boolean } }.
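
In TypeScript terms, that return value would look roughly like this (the type names are only illustrative):

    // Illustrative types; the real nmrium definitions may differ.
    interface SignContours {
      contours: unknown[]; // one entry per calculated contour level
      timeout: boolean; // presumably true when the calculation hit its time limit
    }

    type ContoursResult = Record<
      string, // spectrum ID
      { positive: SignContours; negative: SignContours }
    >;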

You need to create a new function that slices the contours (which include all the levels) based on the current zoom level.

    const level = useContoursLevel(spectrum, sign);
    const contours = useContours();
    const signContours = contours?.[spectrumID][sign] || {
      contours: [],
      timeout: false,
    };
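
A rough sketch of what such a slicing helper could look like, assuming each pre-computed contour carries its zValue (the helper name and shapes are illustrative, not from the codebase):

    interface LevelContour {
      zValue: number;
      lines: number[];
    }

    function sliceContoursByLevel(
      allContours: LevelContour[],
      minLevel: number,
      maxLevel: number,
    ): LevelContour[] {
      // Keep only the already-computed levels inside the current zoom range,
      // instead of rerunning the contour calculation on every zoom change.
      return allContours.filter(
        (contour) => contour.zValue >= minLevel && contour.zValue <= maxLevel,
      );
    }

The component would then filter signContours.contours with such a helper for the levels derived from the current zoom, instead of recalculating them.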

@jobo322 (Member, Author) commented Sep 6, 2024

It is not feasible for big files to calculate all the levels at once.
Is it possible to pass the boundaries to useContours?

@lpatiny (Member) commented Sep 12, 2024

It is not feasible for big files to calculate all the levels at once. Is it possible to pass the boundaries to useContours?

@jobo322 Yes, this is what I was afraid of. What is the maximal number of points in both directions that seems reasonable to you?

This would make things much more complex, because we cannot simply cache the results as we were thinking before.

If we can currently cache the full analysis, we should merge this PR once Michael has validated it. We will then think, in another issue, about how to reduce the resolution and have 'quadrants' or something like that when the data is too big. I may help on this and add some methods in ml-spectra-processing.

@lpatiny (Member) commented Sep 13, 2024

This comment is just for documentation.

I did some tests with this file:

That seems relatively big to me (2048 x 1024).

I also did a lot of benchmarks creating many contours very close to the noise:

https://github.com/mljs/conrec/tree/main/src/__tests__/data

This specific test case generates 314'512'788 elements in the lines array across 101 levels. The process on my Mac M2 takes 8.4 s.

In reality we should only calculate 10 levels at a time, and it should not be such an extreme case. Even in that extreme case, that is roughly 8.4 s / 101 ≈ 0.08 s per level, so 10 levels should take well under 1 s; with the new approach we should therefore wait at most about 1 s the first time they are calculated.

@lpatiny (Member) commented Sep 13, 2024

@hamed-musallam Are you working on this PR? Currently it does not work at all, it seems to me, and it is very slow.
If it is easier, we can close this PR and start over, because I would expect that it should not be such a difficult feature to implement: an object cache that contains {key: level, value: contour}.
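
Something along these lines, just as a sketch (the names and the Contour type are illustrative):

    // Per-spectrum cache keyed by contour level.
    type Contour = unknown;

    const levelCache = new Map<number, Contour>();

    function getCachedContour(
      level: number,
      calculate: (level: number) => Contour,
    ): Contour {
      let contour = levelCache.get(level);
      if (contour === undefined) {
        // Only levels that were never requested before are actually computed.
        contour = calculate(level);
        levelCache.set(level, contour);
      }
      return contour;
    }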

@hamed-musallam (Member)

@hamed-musallam Are you working on this PR? Currently it does not work at all, it seems to me, and it is very slow. If it is easier, we can close this PR and start over, because I would expect that it should not be such a difficult feature to implement: an object cache that contains {key: level, value: contour}.

No, but I believe Alejandro is working on this PR. It's better to keep it open to preserve the comments, and we can reset the branch to main.

@lpatiny (Member) commented Sep 13, 2024

You may also check:

@lpatiny (Member) commented Nov 22, 2024

@jobo322 Could you try to rebase this branch?

@lpatiny (Member) commented Jan 10, 2025

Will be done in:

@lpatiny lpatiny closed this Jan 10, 2025