
object oriented with style controlled by sheets #5

Open

andrewfowlie wants to merge 2 commits into main from oo_style

Conversation

@andrewfowlie
Contributor

This is a relatively big PR.

  • I made the code interact with matplotlib in an object-oriented way, i.e., you can pass in an Axes instance and have the code add, e.g., the 2D posterior to it. This is much more flexible, because users can create their own axes in whatever layout they want.
  • The styling is predominantly set by style sheets, as follows:
import matplotlib.pyplot as plt

from gambit_plotting_tools.gambit_colormaps import register_cmaps

register_cmaps()
plt.style.use(['gambit_plotting_tools.gambit', 'gambit_plotting_tools.dark'])

The register_cmaps call is necessary for matplotlib to pick up GAMBIT's custom colormaps. There is a dark and a light style:

import matplotlib.pyplot as plt

from gambit_plotting_tools.gambit_colormaps import register_cmaps

register_cmaps()
plt.style.use(['gambit_plotting_tools.gambit', 'gambit_plotting_tools.light'])

The dark style is the usual GAMBIT one for 2D plots; the light one is the usual GAMBIT style for 1D plots plus a light colormap. You can use either on 1D or 2D plots, though.
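The object-oriented workflow described above might look roughly like this. The `plot_2D_posterior(..., ax=...)` call is a hypothetical illustration of the interface; the exact signature is not shown in this thread:

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend, just for this example
import matplotlib.pyplot as plt

# users build whatever layout they like...
fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 4))

# ...and hand each Axes to the plotting routines, e.g. (hypothetical call):
# plot_2D_posterior(data, ax=ax1)
ax1.set_title("posterior")
ax2.set_title("profile likelihood")
```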

@andrewfowlie
Contributor Author

Hi Anders, did you check this out?

@anderkve
Contributor

Ah, thanks for the reminder, @andrewfowlie! I had forgotten about this one. Will look at it once I'm done with a current project of adding some new plot modes.

ChrisJChang added a commit that referenced this pull request Mar 10, 2026
  #1 — read_hdf5_datasets: eliminate np.append in loop
  Replaced the per-file np.append pattern (which reallocates the full array on every call) with collecting chunks in a list and calling np.concatenate once at the end. For N files of M points each, this reduces allocations from O(N) full-array copies to one.
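  The pattern can be sketched as follows (`reader` here is a hypothetical per-file loader standing in for the actual HDF5 reading code):

```python
import numpy as np

def read_all(files, reader):
    # collect one array per file in a Python list (cheap appends)...
    chunks = [reader(f) for f in files]
    # ...then allocate the combined array exactly once
    return np.concatenate(chunks)
```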

  #2 — fill_nan_with_neighbor_mean: vectorise triple-nested loop
  Replaced the Python for i / for j / for neighbour loop with scipy.ndimage.generic_filter, which operates in compiled C over the entire array in one pass.
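  A minimal sketch of the generic_filter approach, assuming a 3x3 neighbourhood and NaN padding at the array edges (the actual kernel details are not shown in this message):

```python
import numpy as np
from scipy.ndimage import generic_filter

def fill_nan_with_neighbor_mean(a):
    # nanmean over each 3x3 window; NaNs (including the centre) are ignored
    means = generic_filter(a, np.nanmean, size=3, mode="constant", cval=np.nan)
    # only replace entries that were NaN; keep original values elsewhere
    return np.where(np.isnan(a), means, a)
```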

  #3 — plot_2D_posterior: deduplicate sorted-histogram computation
  The same np.sort + np.cumsum + normalisation was computed twice in the same function call (once for CR masking, once for contour drawing). Now initialised to None and computed at most once, with the second use reusing the cached result.
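  Schematically, the caching pattern looks like this (`credible_levels` is an illustrative stand-in, not the actual function):

```python
import numpy as np

def credible_levels(posterior):
    sorted_cumsum = None  # initialised to None, computed at most once

    def sorted_hist():
        nonlocal sorted_cumsum
        if sorted_cumsum is None:
            s = np.sort(posterior.ravel())[::-1]      # sort descending
            sorted_cumsum = np.cumsum(s) / np.sum(s)  # normalise
        return sorted_cumsum

    mask_levels = sorted_hist()     # first use: computes the result
    contour_levels = sorted_hist()  # second use: returns the cached array
    return mask_levels, contour_levels
```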

  #4 — Replace deepcopy with shallow copies throughout
  - Numeric bounds (xy_bounds, x_bounds): replaced with [list(b[0]), list(b[1])] / list(b) — sufficient since only scalars are mutated.
  - requested_datasets (list of immutable tuples): replaced with list().
  - NumPy array slices (y_data[mask], z_data[mask], etc.): removed entirely — fancy indexing already returns a copy.
  - deepcopy(y_data) in posterior shading: replaced with y_data.copy().
  - Removed the now-unused deepcopy import.
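  The fancy-indexing point is easy to demonstrate: boolean indexing already returns a fresh array, so wrapping it in deepcopy was redundant.

```python
import numpy as np

y = np.arange(5.0)
mask = y > 2

sub = y[mask]   # boolean (fancy) indexing returns a new array, not a view
sub[0] = -1.0   # mutating the result...

# ...leaves the original untouched, so no explicit copy is needed
```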

  #5 — plot_1D_profile confidence band loop: single mask computation
  The boolean mask (x_values > x_start) & (x_values < x_end) was evaluated twice per segment (once for x, once for y) and the result was converted through a Python list. Now the mask is computed once and the arrays are assembled with np.concatenate instead of list wrapping.
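  The single-mask idea, with illustrative data in place of the actual confidence-band arrays:

```python
import numpy as np

x_values = np.linspace(0.0, 10.0, 101)
y_values = np.sin(x_values)
x_start, x_end = 2.0, 5.0

# evaluate the boolean mask once and reuse it for both coordinate arrays
mask = (x_values > x_start) & (x_values < x_end)
x_seg = x_values[mask]
y_seg = y_values[mask]
```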

  #6 — bin_and_profile_2D: nested loop → np.meshgrid
  Replaced the double Python loop over bin centres with a single np.meshgrid call + .ravel(), keeping the same flat-array layout (index = y_bin_index * n_xbins + x_bin_index).
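  A small sketch of the meshgrid replacement, with made-up bin centres; the point is that ravelling the default ('xy'-indexed) meshgrid reproduces the flat layout index = y_bin_index * n_xbins + x_bin_index:

```python
import numpy as np

x_centres = np.array([0.5, 1.5, 2.5])  # n_xbins = 3
y_centres = np.array([10.0, 20.0])     # n_ybins = 2

# one vectorised call replaces the double Python loop over bin centres
xx, yy = np.meshgrid(x_centres, y_centres)
flat_x = xx.ravel()  # x varies fastest, matching the original flat layout
flat_y = yy.ravel()
```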

  Bonus — Module-level np.finfo constants
  np.finfo(float).max and .eps (called ~20 times across function signatures and bodies) are now computed once at import time as _FLOAT_MAX and _FLOAT_EPS.