# nd2reader

### About

`nd2reader` is a pure-Python package that reads images produced by NIS Elements 4.0+. It has only been definitively tested on NIS Elements 4.30.02 Build 1053. Support for older versions is planned.

.nd2 files contain images and metadata, which can be split along multiple dimensions: time, fields of view (xy-plane), focus (z-plane), and filter channel.

`nd2reader` produces data in numpy arrays, which makes it trivial to use with image analysis packages such as `scikit-image` and `OpenCV`.
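Since frames are plain numpy arrays, ordinary array operations apply directly. A minimal sketch of this, using a small synthetic `uint16` array in place of a real frame so that no .nd2 file is needed (with a real file, `frame` would be `nd2[i].data`):

```python
import numpy as np

# Stand-in for image.data: a synthetic 16-bit grayscale frame.
# A real frame from nd2reader is a numpy uint16 array just like this.
frame = np.array([[1894, 1949, 2104],
                  [1825, 1846, 1994],
                  [3642, 3475, 3712]], dtype=np.uint16)

# Ordinary numpy operations work directly on the pixel data.
mean_intensity = frame.mean()
bright_pixels = frame > 3000   # boolean mask of bright pixels
print(mean_intensity)
print(int(bright_pixels.sum()))
```

The same arrays can be handed to `scikit-image` or `OpenCV` functions without any conversion step.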
### Installation

Dependencies will automatically be installed if you don't have them. That said, for optimal performance, you should install the following packages before installing `nd2reader`:

#### Ubuntu

`apt-get install python-numpy python-six` (Python 2.x)

`apt-get install python3-numpy python3-six` (Python 3.x)

#### Other operating systems

These have not been tested yet.

`nd2reader` is compatible with both Python 2.x and 3.x. I recommend installing with pip:

`pip install nd2reader` (Python 2.x)

`pip3 install nd2reader` (Python 3.x)
### ND2s

A quick summary of ND2 metadata can be obtained as shown below.

```python
>>> import nd2reader
>>> nd2 = nd2reader.Nd2("/path/to/my_images.nd2")
>>> nd2
<ND2 /path/to/my_images.nd2>
Created: 2014-11-11 15:59:19
Image size: 1280x800 (HxW)
Image cycles: 636
Channels: '', 'GFP'
Fields of View: 8
Z-Levels: 3
```
You can also get some metadata about the ND2 programmatically:

```python
>>> nd2.height
1280
>>> nd2.width
800
>>> len(nd2)
30528
```
### Images

Every image is returned as an `Image` object, which contains some metadata about the image as well as the raw pixel data itself. Images are always 16-bit grayscale. The `data` attribute holds the numpy array with the image data:

```python
>>> image = nd2[20]
>>> print(image.data)
array([[1894, 1949, 1941, ..., 2104, 2135, 2114],
       [1825, 1846, 1848, ..., 1994, 2149, 2064],
       [1909, 1820, 1821, ..., 1995, 1952, 2062],
       ...,
       [3487, 3512, 3594, ..., 3603, 3643, 3492],
       [3642, 3475, 3525, ..., 3712, 3682, 3609],
       [3687, 3777, 3738, ..., 3784, 3870, 4008]], dtype=uint16)
```
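Because the pixel data is 16-bit and many viewers expect 8-bit input, a common downstream step is rescaling for display. The helper below is not part of nd2reader; it is a hypothetical sketch of a linear rescale, demonstrated on a synthetic frame (with a real file, you would pass in `nd2[20].data`):

```python
import numpy as np

def to_uint8(frame):
    """Linearly rescale a 16-bit frame to the full 8-bit range for display.

    Not part of nd2reader -- just a common preprocessing step for viewers
    that cannot handle 16-bit grayscale input.
    """
    frame = frame.astype(np.float64)
    lo, hi = frame.min(), frame.max()
    if hi == lo:
        # Flat image: avoid division by zero, return all zeros.
        return np.zeros(frame.shape, dtype=np.uint8)
    scaled = (frame - lo) / (hi - lo) * 255.0
    return scaled.astype(np.uint8)

# Synthetic stand-in for image.data
frame = np.array([[1894, 2104], [3487, 4008]], dtype=np.uint16)
print(to_uint8(frame))
```

The darkest pixel maps to 0 and the brightest to 255; any intermediate values scale linearly between them.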
You can get a quick summary of image data by examining the `Image` object:

```python
>>> image
<ND2 Image>
1280x800 (HxW)
Timestamp: 1699.79478134
Field of View: 2
Channel: GFP
Z-Level: 1
```
Or you can access it programmatically:

```python
image = nd2[0]
print(image.timestamp)
print(image.field_of_view)
print(image.channel)
print(image.z_level)
```
Often, you may just want to iterate over each image:

```python
import nd2reader

nd2 = nd2reader.Nd2("/path/to/my_images.nd2")
for image in nd2:
    do_something(image.data)
```
You can also get an image directly by indexing. Here, we look at the 38th image:

```python
>>> nd2[37]
<ND2 Image>
1280x800 (HxW)
Timestamp: 1699.79478134
Field of View: 2
Channel: GFP
Z-Level: 1
```
Slicing is also supported and is extremely memory efficient, as images are only read when directly accessed:

```python
my_subset = nd2[50:433]
for image in my_subset:
    do_something(image.data)
```
Step sizes are also accepted:

```python
# gets every other image in the first 100 images
for image in nd2[:100:2]:
    do_something(image.data)

# iterate backwards over every image, if you're into that kind of thing
for image in nd2[::-1]:
    do_something(image.data)
```
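Lazy iteration pairs well with accumulating computations that only ever hold one frame in memory. As an illustration, here is a sketch of a running maximum-intensity projection (the function name and the synthetic stack are my own; nothing here is part of nd2reader's API):

```python
import numpy as np

def max_projection(frames):
    """Accumulate a maximum-intensity projection over an iterable of 2-D frames.

    Frames are consumed one at a time, so this works with lazy iteration
    without ever loading the whole stack into memory.
    """
    projection = None
    for frame in frames:
        if projection is None:
            projection = frame.copy()
        else:
            np.maximum(projection, frame, out=projection)
    return projection

# With a real file this might be:
#     max_projection(image.data for image in nd2[::2])
# Here we demonstrate on a synthetic stack of uint16 frames.
stack = [np.array([[1, 5], [3, 2]], dtype=np.uint16),
         np.array([[4, 1], [2, 6]], dtype=np.uint16)]
print(max_projection(stack))   # [[4 5] [3 6]]
```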
### Image Sets

If you have complicated hierarchical data, it may be easier to use image sets, which group images together if they share the same time index (not timestamp!) and field of view:

```python
import nd2reader

nd2 = nd2reader.Nd2("/path/to/my_complicated_images.nd2")
for image_set in nd2.image_sets:
    # you can select images by channel
    gfp_image = image_set.get("GFP")
    do_something_gfp_related(gfp_image.data)

    # you can also specify the z-level. this defaults to 0 if not given
    out_of_focus_image = image_set.get("Bright Field", z_level=1)
    do_something_out_of_focus_related(out_of_focus_image.data)
```
To get an image from an image set, you must specify a channel. `get` defaults to the 0th z-level, so if you have more than one z-level you will need to specify it:

```python
image = image_set.get("YFP")
image = image_set.get("YFP", z_level=2)
```
You can also see how many images are in your image set:

```python
>>> len(image_set)
7
```
### Protips

nd2reader is about 14 times faster under Python 3.4 compared to Python 2.7. If you know why, please get in touch!

### Bug Reports and Features

If this fails to work exactly as expected, please open a GitHub issue. If you get an unhandled exception, please paste the entire stack trace into the issue as well.

### Contributing

Please feel free to submit a pull request with any new features you think would be useful. You can also create an issue if you'd just like to propose or discuss a potential idea.

### Acknowledgments

Support for the development of this package was provided by the [Finkelstein Laboratory](http://finkelsteinlab.org/).