How to use the adaptive.learner.learner1D._get_neighbors_from_list function in adaptive

To help you get started, we’ve selected a few adaptive examples, based on popular ways it is used in public projects.
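_get_neighbors_from_list is a private helper in adaptive.learner.learner1D that takes an array of x-values and builds a sorted mapping from each point to its left and right neighbours, with None on the side where a boundary point has no neighbour. Before the repository example below, here is a minimal usage sketch; the exact return type (a sortedcontainers.SortedDict in the versions we checked) is an assumption, since the function is private and may change between releases.

import numpy as np
from adaptive.learner.learner1D import _get_neighbors_from_list

# Four sample points, deliberately unsorted.
points = np.array([0.0, 0.5, 1.0, 0.25])

neighbors = _get_neighbors_from_list(points)

# Each key maps to [x_left, x_right]; the outermost points get None
# on the side where they have no neighbour. Expected output:
#   neighbors[0.25] -> [0.0, 0.5]
#   neighbors[0.0]  -> [None, 0.25]
#   neighbors[1.0]  -> [0.5, None]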


Example from github.com/python-adaptive/adaptive — adaptive/learner/average1D.py (view on GitHub):
self.pending_points[x].difference_update(seeds)
                if len(self.pending_points[x]) == 0:
                    # Remove if pending_points[x] is an empty set.
                    del self.pending_points[x]

        # Below is the same as 'Learner1D.tell_many'.

        # Get all data as numpy arrays
        points = np.array(list(self._data.keys()))
        values = np.array(list(self.data.values()))
        points_pending = np.array(list(self.pending_points))
        points_combined = np.hstack([points_pending, points])

        # Generate neighbors
        self.neighbors = _get_neighbors_from_list(points)
        self.neighbors_combined = _get_neighbors_from_list(points_combined)

        # Update scale
        self._bbox[0] = [points_combined.min(), points_combined.max()]
        self._bbox[1] = [values.min(axis=0), values.max(axis=0)]
        self._scale[0] = self._bbox[0][1] - self._bbox[0][0]
        self._scale[1] = np.max(self._bbox[1][1] - self._bbox[1][0])
        self._oldscale = deepcopy(self._scale)

        # Find the intervals for which the losses should be calculated.
        intervals, intervals_combined = [
            [(x_m, x_r) for x_m, (x_l, x_r) in neighbors.items()][:-1]
            for neighbors in (self.neighbors, self.neighbors_combined)
        ]

        # Compute the losses for the "real" intervals.
        self.losses = loss_manager(self._scale[0])
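The snippet above derives the loss intervals directly from the neighbour mapping: because the mapping is sorted by x, pairing each key with its right neighbour and dropping the last entry (whose right neighbour is None) yields the consecutive intervals (x_i, x_{i+1}). A standalone sketch of that step, using illustrative values rather than the library's own data:

from adaptive.learner.learner1D import _get_neighbors_from_list

xs = [0.0, 0.25, 0.5, 1.0]
neighbors = _get_neighbors_from_list(xs)

# Pair each point with its right neighbour; the last point has no right
# neighbour, so it is dropped with [:-1].
intervals = [(x_m, x_r) for x_m, (x_l, x_r) in neighbors.items()][:-1]
# Expected: [(0.0, 0.25), (0.25, 0.5), (0.5, 1.0)]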