These functions access the core functionality of trackpy:
locate(raw_image, diameter[, minmass, ...]) | Locate Gaussian-like blobs of some approximate size in an image. |
batch(frames, diameter[, minmass, maxsize, ...]) | Locate Gaussian-like blobs of some approximate size in a set of images. |
link_df(features, search_range[, memory, ...]) | Link features into trajectories, assigning a label to each trajectory. |
link_df_iter(features, search_range[, ...]) | Link features into trajectories, assigning a label to each trajectory. |
link_df() and link_df_iter() run the same underlying code, but link_df_iter() streams through large data sets one frame at a time. See the tutorial on large data sets for more.
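For orientation, here is a minimal sketch of how these fit together. The file pattern, diameter, minmass, search_range, and memory values are placeholders you would tune for your own data, and pims is a separate package commonly used with trackpy for reading image sequences:

    import pims
    import trackpy as tp

    frames = pims.open('images/*.png')               # hypothetical image sequence
    f = tp.batch(frames, diameter=11, minmass=200)   # locate features in every frame
    t = tp.link_df(f, search_range=5, memory=3)      # link features into trajectories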
imsd(traj, mpp, fps[, max_lagtime, ...]) | Compute the mean squared displacement of each particle. |
emsd(traj, mpp, fps[, max_lagtime, detail, ...]) | Compute the ensemble mean squared displacements of many particles. |
compute_drift(traj[, smoothing, pos_columns]) | Return the ensemble drift, x(t). |
subtract_drift(traj[, drift]) | Return a copy of particle trajectories with the overall drift subtracted out. |
vanhove(pos, lagtime[, mpp, ensemble, bins]) | Compute the van Hove correlation (histogram of displacements). |
relate_frames(t, frame1, frame2[, pos_columns]) | Find the displacement vector of all particles between two frames. |
velocity_corr(t, frame1, frame2) | Compute the velocity correlation between every pair of particles’ displacements. |
direction_corr(t, frame1, frame2) | Compute the cosine between every pair of particles’ displacements. |
proximity(features[, pos_columns]) | Find the distance to each feature’s nearest neighbor. |
is_typical(msds, frame[, lower, upper]) | Identify which particles’ MSDs are in the central quantile. |
diagonal_size(single_trajectory[, ...]) | Measure the diagonal size of a trajectory. |
filter_stubs(tracks[, threshold]) | Filter out trajectories with few points. |
filter_clusters(tracks[, quantile, threshold]) | Filter out trajectories with a mean particle size above a given quantile. |
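As a rough sketch of a typical analysis chain, assuming t is a linked DataFrame as returned by link_df() and that the mpp (microns per pixel), fps, and threshold values suit your experiment:

    import trackpy as tp

    t1 = tp.filter_stubs(t, threshold=50)    # drop trajectories shorter than 50 points
    d = tp.compute_drift(t1)                 # ensemble drift x(t)
    tm = tp.subtract_drift(t1, d)            # drift-corrected copy of the trajectories
    em = tp.emsd(tm, mpp=0.1, fps=30)        # ensemble MSD in physical units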
Trackpy extends the Crocker–Grier algorithm using a prediction framework, described in the prediction tutorial.
predict.NullPredict | Predict that particles will not move. |
predict.ChannelPredict(bin_size[, ...]) | Predict a particle’s position based on its spanwise coordinate in a channel. |
predict.DriftPredict([initial_guess, span]) | Predict a particle’s position based on the mean velocity of all particles. |
predict.NearestVelocityPredict([...]) | Predict a particle’s position based on the most recent nearby velocity. |
predict.predictor(predict_func) | Decorator to vectorize a predictor function for a single particle. |
predict.instrumented([limit]) | Decorate a predictor class and allow it to record inputs and outputs. |
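A brief sketch of how a predictor is typically used in place of plain link_df(), following the prediction tutorial. It assumes features is a DataFrame from locate() or batch(), that the predictor exposes link_df()/link_df_iter() wrappers as the tutorial describes, and that search_range is a placeholder value:

    import trackpy as tp
    import trackpy.predict

    pred = tp.predict.NearestVelocityPredict()
    t = pred.link_df(features, search_range=5)   # used like tp.link_df()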
Trackpy includes functions for plotting the data in ways that are commonly useful. If you don’t find what you need here, you can plot the data any way you like using matplotlib, seaborn, or any other plotting library.
annotate(centroids, image[, circle_size, ...]) | Mark identified features with white circles. |
scatter | |
plot_traj(traj[, colorby, mpp, label, ...]) | Plot traces of trajectories for each particle. |
annotate3d(centroids, image, **kwargs) | Annotates a 3D image and returns a scrollable stack for display in IPython. |
scatter3d | |
plot_traj3d | |
plot_displacements(t, frame1, frame2[, ...]) | Plot arrows showing particle displacements between two frames. |
subpx_bias(f[, pos_columns]) | Histogram the fractional part of the x and y position. |
plot_density_profile | |
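For example, a quick look at located features and linked trajectories might look like the sketch below, assuming f and t are DataFrames from locate() and link_df(), frames is an image sequence, and the mpp value is illustrative:

    import matplotlib.pyplot as plt
    import trackpy as tp

    tp.annotate(f, frames[0])     # circle the features found in the first frame
    tp.plot_traj(t, mpp=0.1)      # overlay the linked trajectories
    plt.show()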
These two are almost too simple to justify their existence; they are just convenient shorthand for common plotting tasks.
mass_ecc(f[, ax]) | Plot each particle’s mass versus eccentricity. |
mass_size(f[, ax]) | Plot each particle’s mass versus size. |
By default, locate() and batch() apply a bandpass and a percentile-based threshold to the image(s) before finding features. (You can turn off this functionality using preprocess=False and percentile=0.) In many cases, the default bandpass, which guesses good length scales from the diameter parameter, “just works.” But if you want to execute these steps manually, you can.
bandpass(image, lshort, llong[, threshold, ...]) | Remove noise and background variation. |
percentile_threshold(image, percentile) | Find grayscale threshold based on distribution in image. |
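For instance, a sketch of running the preprocessing by hand. It assumes raw_image is a 2D ndarray; the lshort, llong, percentile, and diameter values are illustrative (the defaults are derived from diameter), and zeroing pixels below the threshold is just one simple way to apply it:

    import trackpy as tp

    smoothed = tp.bandpass(raw_image, lshort=1, llong=11)   # remove noise and background variation
    thresh = tp.percentile_threshold(smoothed, 64)          # grayscale threshold from the pixel distribution
    smoothed[smoothed < thresh] = 0                         # apply the threshold by hand
    f = tp.locate(smoothed, diameter=11, preprocess=False)  # find features without re-preprocessing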
Trackpy implements a generic interface that could be used to store and retrieve particle tracking data in any file format. We hope that it can make it easier for researchers who use different file formats to exchange data. Any in-house format could be accessed using the same simple interface in trackpy.
At present, the interface is implemented only for HDF5 files. There are several different implementations, each with different performance optimizations. PandasHDFStoreBig is a good general-purpose choice.
PandasHDFStore(filename[, mode, t_column]) | An interface to an HDF5 file with framewise access, using pandas. |
PandasHDFStoreBig(filename[, mode, t_column]) | Like PandasHDFStore, but keeps a cache of frame numbers. |
PandasHDFStoreSingleNode(filename[, key, ...]) | An interface to an HDF5 file with framewise access, using pandas, that is faster for cross-frame queries. |
FramewiseData | Abstract base class defining a data container with framewise access. |
That last class cannot be used directly; it is meant to be subclassed to support other formats. See Writing Your Own Interface in the streaming tutorial for more.
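A rough sketch of streaming through a large movie with one of these stores, along the lines of the streaming tutorial. The file name, diameter, and search_range are placeholders, frames is assumed to be an image sequence, and it assumes batch() accepts a store via its output parameter as described in that tutorial:

    import trackpy as tp

    with tp.PandasHDFStoreBig('data.h5') as s:
        tp.batch(frames, diameter=11, output=s)          # features go straight to disk
        for linked in tp.link_df_iter(s, search_range=5):
            s.put(linked)                                # write back linked trajectories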
Trackpy issues log messages. This functionality is mainly used to report the progress of lengthy jobs, but it may be used in the future to report details of feature-finding and linking for debugging purposes.
When trackpy is imported, it automatically calls handle_logging(), which sets the logging level and attaches a logging handler that plays nicely with IPython notebooks. You can override this by calling ignore_logging() and configuring the logger however you like.
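For example, to take over logging configuration yourself, a minimal sketch using the standard logging module (the logger name 'trackpy' is an assumption; adjust to taste):

    import logging
    import trackpy as tp

    tp.ignore_logging()                                      # detach trackpy's default handler
    logging.getLogger('trackpy').setLevel(logging.WARNING)   # then configure however you like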
minmass_version_change(raw_image, old_minmass) | Convert minmass value from v0.2.4 to v0.3. |
utils.fit_powerlaw(data[, plot]) | Fit a powerlaw by doing a linear regression in log space. |
strip_diagnostics(tracks) | Remove diagnostic information from a tracks DataFrame. |
diag.performance_report() | Display a summary of which optional speedups are installed/enabled. |
diag.dependencies() | Give the version of each of the dependencies – useful for bug reports. |
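For example, fit_powerlaw() is handy for extracting the exponent of an ensemble MSD. A minimal sketch, assuming em is the ensemble MSD returned by emsd():

    import trackpy as tp

    fit = tp.utils.fit_powerlaw(em, plot=False)   # linear regression in log space
    print(fit)                                    # power-law exponent and prefactor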
Trackpy implements the most intensive (read: slowest) parts of the core feature-finding and linking algorithm in pure Python (with numpy) and also in numba, which accelerates Python code. Numba can offer a major performance boost, but it is still relatively new, and it can be challenging to use. If numba is available, trackpy will use the numba implementation by default; otherwise, it will use pure Python. The following functions allow sophisticated users to manually switch between numba and pure-Python modes. This may be used, for example, to measure the performance of these two implementations on your data.
enable_numba() | Use numba-accelerated variants of core functions. |
disable_numba() | Do not use numba-accelerated functions, even if numba is available. |
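For example, a rough timing comparison on your own data might look like this, assuming frame is a representative image and the diameter value suits your features:

    import timeit
    import trackpy as tp

    tp.enable_numba()
    t_numba = timeit.timeit(lambda: tp.locate(frame, diameter=11), number=10)

    tp.disable_numba()
    t_python = timeit.timeit(lambda: tp.locate(frame, diameter=11), number=10)

    print(t_numba, t_python)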
The key steps of the feature-finding algorithm are implemented as separate, modular functions. You can run them in sequence to inspect intermediate steps, or you can use them to roll your own variation on the algorithm.
local_maxima(image, radius[, percentile, margin]) | Find local maxima whose brightness is above a given percentile. |
refine(raw_image, image, radius, coords[, ...]) | Find the center of mass of a bright feature starting from an estimate. |
estimate_mass(image, radius, coord) | Compute the total brightness in the neighborhood of a local maximum. |
estimate_size(image, radius, coord, ...) | Estimate the size (radius of gyration) of a bright feature centered at a local maximum. |
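A sketch of running these steps by hand. It assumes raw_image is a 2D ndarray, that these functions are importable from the top-level trackpy namespace as listed here, and that radius accepts a per-dimension tuple; the radius, percentile, and bandpass length scales are illustrative (locate() normally derives them from diameter):

    import trackpy as tp

    radius = (5, 5)                                         # one entry per image dimension
    smoothed = tp.bandpass(raw_image, lshort=1, llong=11)   # preprocess as locate() would
    coords = tp.local_maxima(smoothed, radius, percentile=64)
    refined = tp.refine(raw_image, smoothed, radius, coords)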
All of the linking functions in trackpy provide the same level of control over the linking algorithm itself. For almost all users, the functions above will be sufficient. But link_df() and link_df_iter() assume that the data is stored in a pandas DataFrame. For users who want to use some other iterable data structure, the functions below provide direct access to the linking code.
link_iter(levels, search_range[, memory, ...]) | Link features into trajectories, assigning a label to each trajectory. |
link(levels, search_range, hash_generator[, ...]) | Link features into trajectories, assigning a label to each trajectory. |
And the following classes can be subclassed to implement a customized linking procedure.
Point() | Base class for points (features) used in tracking. |
PointND(t, pos[, id]) | Version of Point for tracking in flat space with non-periodic boundary conditions. |
Track([point]) | Base class for objects to represent linked tracks. |
TrackUnstored([point]) | Base class for objects to represent linked tracks. |
HashTable(dims, box_size) | Basic hash table for fast look up of particles in neighborhood. |
SubnetOversizeException | An Exception to be raised when the sub-nets are too big to be efficiently linked. |
These functions may also be useful for rolling your own algorithms:
masks.binary_mask | Elliptical mask in a rectangular array. |
masks.r_squared_mask | Mask with values r^2 inside radius and 0 outside. |
masks.cosmask | Cosine of theta_mask. |
masks.sinmask | Sine of theta_mask. |
masks.theta_mask | Mask of values giving angular position relative to center. |
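For instance, a sketch of computing a feature’s integrated brightness (“mass”) and radius of gyration by hand, assuming binary_mask and r_squared_mask take a per-dimension radius and the number of dimensions; the image and the feature center here are stand-ins for real data:

    import numpy as np
    from trackpy import masks

    radius = 5
    y, x = 20, 30                                   # hypothetical feature center
    image = np.random.rand(64, 64)                  # stand-in for a real image

    mask = masks.binary_mask(radius, ndim=2)        # circular footprint
    neighborhood = mask * image[y - radius:y + radius + 1,
                                x - radius:x + radius + 1]
    mass = neighborhood.sum()                       # total brightness, as in estimate_mass
    Rg = np.sqrt((masks.r_squared_mask(radius, 2) * neighborhood).sum() / mass)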