METHODS & TECHNIQUES
High-speed surface reconstruction of a flying bird using structured light
Marc E. Deetjen, Andrew A. Biewener, David Lentink
Journal of Experimental Biology 2017 220: 1956-1961; doi: 10.1242/jeb.149708
Marc E. Deetjen
1Department of Mechanical Engineering, Stanford University, Palo Alto, CA 94305, USA
  • For correspondence: mdeetjen@stanford.edu
Andrew A. Biewener
2Harvard University, Department of Organismic and Evolutionary Biology, Cambridge, MA 02138, USA
David Lentink
1Department of Mechanical Engineering, Stanford University, Palo Alto, CA 94305, USA

ABSTRACT

Birds fly effectively and maneuver nimbly by dynamically changing the shape of their wings during each wingbeat. These shape changes have yet to be quantified automatically at high temporal and spatial resolution. Therefore, we developed a custom 3D surface reconstruction method, which uses a high-speed camera to identify spatially encoded binary striped patterns that are projected on a flying bird. This non-invasive structured-light method allows automated 3D reconstruction of each stand-alone frame and can be extended to multiple views. We demonstrate this new technique by automatically reconstructing the dorsal surface of a parrotlet wing at 3200 frames s−1 during flapping flight. From this shape we analyze key parameters such as wing twist and angle of attack distribution. While our binary ‘single-shot’ algorithm is demonstrated by quantifying dynamic shape changes of a flying bird, it is generally applicable to moving animals, plants and deforming objects.

INTRODUCTION

All flying animals rely to some extent on dynamic body-shape changes to propel themselves. Insects rely predominantly on passive wing morphing through aero-elastic wing deformation (Combes and Daniel, 2003; Wootton, 1992). Bats can actively change the shape of their wings through musculoskeletal control and muscle fibers in their membrane (Cheney et al., 2014). Amongst active flyers, birds can morph their wings to the greatest extent, from fully extended to completely folded in flight (Pennycuick, 2008; Williams and Biewener, 2015), but how they utilize morphing during flapping and maneuvering flight is not fully understood. Such questions have traditionally been addressed by measuring the 3D body kinematics of flying animals using semi-automated marker tracking (Hedrick, 2008; Hedrick et al., 2002; Ros et al., 2015; Tobalske et al., 2007), feature tracking (Biesel et al., 1985; Carruthers et al., 2010; Walker et al., 2009) or visual-hull based reconstruction methods (Fontaine et al., 2009; Muijres et al., 2014; Ristroph et al., 2009). However, none of these methods can directly and automatically reconstruct the wing surface at high resolution.

Structured-light-based methods record the deformation of a projected light pattern due to the animal's surface geometry for offline 3D reconstruction (Fig. 1A), generally by using one of two different pattern encoding techniques. Temporally encoded projection patterns require comparison of consecutive frames. Previous studies have shown that slowly moving human body parts and internal organs can be reconstructed using binary (Ackerman et al., 2002; McKeon and Flynn, 2010) and phase-shifted (Lohry and Zhang, 2014; Wang et al., 2013) temporal coding. During pilot experiments, we determined that this method is too slow to be automated for bird flight. Spatially encoded projection patterns can reconstruct a sequence of stand-alone frames and are hence called ‘single-shot’ (Salvi et al., 2010; Zhang, 2012), which gives the advantage of being robust to inter-frame movement. Some existing spatially encoded structured-light methods rely on binary pseudo-random dots but either have relatively low frame rate and accuracy (Saberioon and Cisar, 2016; Sarbolandi et al., 2015) or require manual digitizing of numerous points per frame (Wolf and Konrath, 2015; Zhang et al., 2008). Other existing spatial methods use grayscale patterns which cannot be projected at high frame rates (Guan et al., 2003; Lenar et al., 2013; Sagawa et al., 2012; Su and Liu, 2006). Because we found that no existing system can automatically measure dynamic shape changes at sufficiently high speeds, we developed a custom method. This new single-shot structured-light technique can automatically resolve body shape changes at high temporal and spatial resolution.

MATERIALS AND METHODS

High-speed 3D surface reconstruction experimental setup

The experimental setup (Fig. 1A) consisted of a 3D calibrated and synchronized high-speed camera (Phantom Miro M310; Vision Research, Wayne, NJ, USA) and high-speed projector (DLP® LightCrafter™ E4500MKII™; EKB Technologies, Bat-Yam, Israel) operating at 3200 frames s−1. Calibration of the system was achieved using a modified version of the camera calibration toolbox for MATLAB (http://www.vision.caltech.edu/bouguetj/calib_doc/). All data processing was conducted in MATLAB R2015b. We analyzed the first two wingbeats after take-off of a 4-year-old female near-white Pacific parrotlet [Forpus coelestis (Lesson 1847), 27–29 g, 0.23 m wingspan], which was trained using positive reinforcement to fly between perches 0.5 m apart. All experiments were in accordance with Stanford University's Institutional Animal Care and Use Committee.

Fig. 1.

We developed a new high-speed 3D surface reconstruction technique for rapidly locomoting animals based on binary spatially encoded structured light. (A) The 3D calibrated projection of a structured-light pattern on a flying bird is imaged by a high-speed camera. (B) The projected pattern is shown in black and white, while the color red is used to indicate the known lines and numbers of horizontally projected stripes (we show a side view of horizontal projection planes). Horizontal stripes are spaced unequally to ensure local uniqueness of the pattern, while vertical stripes are equally spaced for high spatial resolution. (C) The images captured by the camera with unknown horizontal stripes are labeled in blue. (D,E) Ordered but unidentified horizontal stripes (blue letters and lines) are matched with an ordered subset of known projection stripes (red numbers and lines). In D, the blue and red lines do not match, whereas in E they match. (F) Horizontal stripe matching error as a function of matching permutation, including the two boxed examples shown in D and E. The error is computed as the mean squared angle in radians between matching stripes. Note that the stripe numbering and lettering conventions in this figure do not match the equations given in the text; they serve an illustrative purpose only.

Design of the single-shot structured-light pattern

To achieve a binary single-shot light pattern for high-speed surface reconstruction, we modified an existing single-shot structured-light technique (Kawasaki et al., 2008). The projected pattern consists of horizontal and vertical stripes that form a grid; in the original method, different colors were used to distinguish the horizontal and vertical stripes. We simplified this approach for use at high speed by removing the color coding and instead relying on image processing to distinguish the binary stripes (Fig. 1B). Vertical stripes are equally spaced and densely packed for high spatial resolution, while horizontal stripes are unequally spaced to ensure local uniqueness. To enhance robustness, the horizontal stripes can be spaced such that the gap pattern of every set of four consecutive stripes is unique.
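The paper does not give the exact stripe spacings, so the following Python sketch shows one way to guarantee the stated local-uniqueness property (a unique spacing signature for every set of four consecutive stripes, i.e. every run of three gaps) by drawing the gaps from a de Bruijn sequence. The gap values in projector pixels are illustrative assumptions, not the authors' values (the authors worked in MATLAB).

```python
def de_bruijn(k, n):
    """Generate a de Bruijn sequence over k symbols with window length n
    (standard Lyndon-word algorithm); every length-n window is unique."""
    a = [0] * (k * n)
    seq = []

    def db(t, p):
        if t > n:
            if n % p == 0:
                seq.extend(a[1:p + 1])
        else:
            a[t] = a[t - p]
            db(t + 1, p)
            for j in range(a[t - p] + 1, k):
                a[t] = j
                db(t + 1, t)

    db(1, 1)
    return seq

def stripe_positions(n_stripes, gaps=(8, 11, 14, 17)):
    """Horizontal stripe positions (projector rows) whose gap sequence follows
    a de Bruijn sequence, so every run of four consecutive stripes (three
    gaps) has a unique spacing signature. Gap values are hypothetical."""
    symbols = de_bruijn(len(gaps), 3)
    if n_stripes - 1 > len(symbols):
        raise ValueError("too many stripes for this gap alphabet")
    positions = [0]
    for s in symbols[:n_stripes - 1]:
        positions.append(positions[-1] + gaps[s])
    return positions
```

With four gap values, up to 65 stripes can be placed before any four-stripe spacing pattern repeats.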

There are four key advantages of this scheme. First, it is designed for full automation, which allows a high throughput of data. Second, it is single shot, which makes it robust for rapidly deforming objects. Third, it uses binary light patterns, which allow the projector to run at its maximum frame rate. Fourth, it uses a single color, which permits both maximum frame rate and multiple simultaneous view angles. Interference between multiple projectors can be avoided by using different color channels, light polarization or slightly out-of-phase light pulses.

Image processing for identifying and ordering stripes

Before reconstructing the 3D surface, image processing was required to separate the vertical and horizontal stripes in the camera image, order these stripes and find their intersections. We applied the following fully automated steps (Fig. S1). First, the camera image was rotated to align the equally spaced stripes vertically. Next, the Laplacian of a directional Gaussian filter was applied in the horizontal and vertical directions, and adaptive thresholding was used to generate a noisy approximation of the horizontal and vertical stripes. The noise was filtered out by imposing a minimum length requirement on each connected white region. Stripes with gaps were extended and closed by choosing paths that best combine high pixel brightness with the correct stripe direction; these factors were weighted, and stripes were only extended if a preset cut-off value was satisfied.

After all stripes were identified, their subpixel centers were determined from the original image by quadratically fitting the brightness levels perpendicular to each stripe, and the intersection points between horizontal and vertical stripes were then located from these center lines. Regions near intersections produced inaccurate center lines, so these regions were interpolated and the intersections recomputed. Finally, all stripes were ordered based on the connections between them, and discontinuous grids were ordered separately.
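The quadratic subpixel refinement can be sketched as fitting a parabola to three brightness samples around the maximum of a profile taken perpendicular to the stripe. This is a generic version of that step, not the authors' code:

```python
import numpy as np

def subpixel_peak(brightness):
    """Refine a stripe center to subpixel precision by fitting a parabola
    to three brightness samples around the brightest pixel of a profile
    sampled perpendicular to the stripe; returns the vertex position."""
    i = int(np.argmax(brightness))
    i = min(max(i, 1), len(brightness) - 2)  # keep a 3-point neighborhood
    y0, y1, y2 = brightness[i - 1], brightness[i], brightness[i + 1]
    denom = y0 - 2.0 * y1 + y2
    if denom == 0:  # flat profile: fall back to the integer peak
        return float(i)
    return i + 0.5 * (y0 - y2) / denom
```

For an exactly parabolic profile the vertex is recovered exactly; for smooth peaks such as a Gaussian stripe cross-section the error is a small fraction of a pixel.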

Stripe matching algorithm

To triangulate light on the bird's surface, the unknown 3D planes visible from the camera needed to be matched with known 3D light planes generated by the projector (Fig. 1B,C). The algorithm described here used one camera and one projector, but the same steps can be followed for multiple calibrated cameras and projectors. After completing the image processing steps, variables were organized with lowercase letters referring to unknown planes and uppercase letters referring to known planes. In the projected pattern, there was an ordered set of M known vertical planes with Kth unit normal <A_K, B_K, C_K> and an ordered set of N known horizontal planes with Lth unit normal <D_L, E_L, F_L>. From the camera image, there was a set of m unknown vertical planes with kth unit normal <a_k, b_k, c_k> and n unknown horizontal planes with lth unit normal <d_l, e_l, f_l>. The order of the unknown planes and their 2D intersection points (x_kl, y_kl), however, was known as long as the grid connected the stripes (minimum four connections required). If there were discontinuities in the grid, the separate portions were processed separately. Calibration of the camera and projector produced Eqns 1–3:

λP_c = K_c [I | 0] P_w, (1)

λP_p = K_p [R | T] P_w, (2)

K_{c/p} = [α 0 u_0; 0 β v_0; 0 0 1], (3)

where λ is an arbitrary homogeneous scale factor, P_{c/p} are 2D homogeneous camera or projector (c/p) coordinates in pixels, P_w are 3D world coordinates, K_{c/p} is the internal calibration matrix defined by Eqn 3 (with constants α, β, u_0 and v_0), R is the rotation matrix from the camera to the projector and T is the translation vector from the camera to the projector (the optical center of the camera lies at the origin). While it was unknown to which projection plane each plane in the camera image corresponded, two equations could be written per camera plane based on the calibration above (Eqns 4 and 5), derived from the principle that all of the projected planes intersect at the optical center of the projector. Eqn 6 then followed from all vertical planes intersecting at a common vector in space, while Eqn 7 is the equivalent for all horizontal planes. Embedded Image (4) Embedded Image (5) Embedded Image (6) Embedded Image (7)

Brackets used in Eqns 4–7 indicate the selection of a specific [row, column] entry of a matrix.

For each intersection of a horizontal plane, a vertical plane and the bird, the calibration was used to write Eqns 8 and 9, where x, y and z define the unknown 3D location of the intersection point:

x_kl = α(x/z) + u_0, (8)

y_kl = β(y/z) + v_0. (9)

Further, we knew which two planes intersected at this point, so two more equations defining each plane could be written (not shown). These four equations were combined into a single equation by eliminating x, y and z (Eqn 10): Embedded Image (10)

With all known and unknown variables defined and constrained, known and unknown planes could be matched. This was done by ‘sliding’ the unknown planes onto different known plane positions to determine which position results in a minimal matching error, as visualized in Fig. 1D–F. The mathematical analog to this combines Eqns 4–7 and 10 into one large matrix equation (Eqn 11):

MX = B, (11)

where M and B are constant matrices and X contains the sought-after unit normal vectors for all ordered horizontal and vertical unknown planes. Because X has one degree of freedom, it can be rewritten as:

X = X_p + pX_h, (12)

where X_p is a particular solution of X, X_h is the homogeneous solution of X, and p is a variable. The value of p is critical as it determines the particular solution of X and can be tuned to slide the unknown planes to different positions to reduce matching errors. Singular value decomposition (Eqn 13), where Σ contains the singular values σ and U and V are square orthogonal matrices, was then used to find X_p (Eqn 14) and X_h, which is the rightmost column of V:

M = UΣV^T, (13)

X_p = VΣ^+U^T B, (14)

where Σ^+ denotes the pseudo-inverse of Σ.
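Eqns 11–14 can be mirrored with NumPy's SVD and pseudo-inverse. A minimal sketch, assuming (as the text states) that M has a one-dimensional null space:

```python
import numpy as np

def solve_one_dof_system(M, B):
    """Solve M X = B when M has a one-dimensional null space (Eqn 11):
    return a particular solution Xp (minimum-norm, via the pseudo-inverse,
    Eqn 14) and the homogeneous solution Xh (the right singular vector of
    the smallest singular value), so X = Xp + p * Xh for any scalar p."""
    Xp = np.linalg.pinv(M) @ B   # particular solution, Eqn 14
    _, _, Vt = np.linalg.svd(M)  # SVD, Eqn 13
    Xh = Vt[-1]                  # rightmost column of V spans the null space
    return Xp, Xh
```

Any value of p then slides the solution along the null-space direction without violating the constraints, which is the degree of freedom exploited by the stripe-matching step.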

For each potential match between known and unknown planes, the error was computed as the mean squared angle between the known and unknown planes. The correct matching sequence for the horizontal planes gave a much lower error than other possible matches because of the unequal spacing of the stripes. Once the correct match was found for the horizontal planes, the corresponding value of p in Eqn 12 was used to match the vertical planes as well.

3D surface reconstruction and system accuracy

After the unknown planes were matched with the projected planes, 3D reconstruction of the surface was straightforward. Each stripe seen on the bird defined a light plane, and each point at the center of a stripe defined the 3D ray along which that point must lie; the intersection of the ray and the plane lies on the bird's surface. We then fit a surface (average 26,907 points) to the point cloud of data (average 285 intersections and 11,405 total points) using the Gridfit toolbox in MATLAB (https://www.mathworks.com/matlabcentral/fileexchange/8998-surface-fitting-using-gridfit), which uses a modified ridge estimator with tunable smoothing. The result is shown in Fig. 2E for different wingbeat phases reconstructed with a single camera and projector.
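The triangulation of each stripe-center point reduces to a ray-plane intersection. A minimal sketch, assuming the camera center at the origin and the light plane written as n · x + d = 0 (conventions chosen here for illustration, not taken from the paper):

```python
import numpy as np

def ray_plane_intersection(ray_dir, plane_normal, plane_d):
    """Intersect the camera ray t * ray_dir (camera center at the origin)
    with the light plane n . x + d = 0; the intersection point lies on the
    observed surface."""
    ray_dir = np.asarray(ray_dir, dtype=float)
    denom = np.dot(plane_normal, ray_dir)
    if abs(denom) < 1e-12:
        raise ValueError("ray is parallel to the plane")
    t = -plane_d / denom
    return t * ray_dir
```

Repeating this for every stripe-center pixel yields the point cloud that is then smoothed into a surface.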

Fig. 2.

Single-shot 3D surface reconstruction accuracy and results. (A,B) Method verification tests using a 22.23±0.05 mm (mean±s.d.) radius sphere result in a reconstruction error of 0.31±1.03 mm averaged over sphere location for 400 frames, showing that the new method is both accurate and precise. In A, the probability density function (PDF) of the error in the measured radius is shown. (C) To reconstruct a surface, horizontal and vertical stripes are separated using image processing techniques (Fig. S1). Next, all intersections of these stripes are calculated (one example is shown as a green dot). (D) Each stripe seen in the camera represents a 3D projected plane of light, and each 2D intersection point represents a vector in 3D space on which the 3D intersection on the bird's surface lies. A sampling of projection planes is shown in black, while single horizontal and vertical planes are extended along with their intersection point on the bird's surface. Color coding corresponds to C. (E) 3D surface reconstruction of the flapping wing (and body) of a bird in flight at 2.5 ms intervals (33, 44 and 55% of its first downstroke after take-off).

The achievable reconstruction accuracy was estimated in a separate test with similar equipment settings. Using a sphere of known radius (22.23±0.05 mm, mean±s.d.), we found an accuracy of 0.31 mm (mean error) and a precision of ±1.03 mm (s.d.) (Fig. 2A,B). Errors were largest (approximately double) in areas outside the calibration volume (the image corner regions). Additionally, occasional stripe mismatching occurred in the image processing steps, which accounts for the other larger errors (Fig. S3). When processing both the sphere and the bird, no manual intervention was needed, and bird reconstruction was successful for 98% of the frames over four separate downstroke segments (two wingbeats in each of two flights).
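A sphere test like the one reported can be scored by fitting a sphere to the reconstructed points and comparing the fitted radius with the known one. A common linear least-squares formulation, sketched here (not the authors' implementation):

```python
import numpy as np

def fit_sphere(points):
    """Linear least-squares sphere fit: rewriting |x - c|^2 = r^2 as
    2 c . x + (r^2 - |c|^2) = |x|^2 makes the problem linear in the
    center c and the scalar k = r^2 - |c|^2."""
    points = np.asarray(points, dtype=float)
    A = np.hstack([2.0 * points, np.ones((len(points), 1))])
    b = np.sum(points ** 2, axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = sol[:3]
    radius = np.sqrt(sol[3] + center @ center)
    return center, radius
```

The per-point radial residuals against the known radius then give the error and s.d. quoted in the text.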

Calculation of dorsal surface parameters

To calculate bird-specific surface parameters, a reference frame must be defined for each video frame. To accomplish this automatically, we first identified the body of the bird by finding surface points that move less than a preset threshold distance (Fig. S2). To compute the z-axis, the body points were fit to a plane, after which the x-axis was computed by finding the line of symmetry of the body points projected onto that plane. To find a repeatable origin, the top of the bird's head was located by fitting a paraboloid to points on the head. For the frames analyzed here, the orientation of the axes did not change significantly and was thus held constant across all frames, while the origin was fit linearly over all relevant frames. This computed body reference frame is labeled with subscript b. A second reference frame, the wing reference frame (subscript w), was used for measuring wing shape; it was found by rotating the body reference frame about the xb-axis to best fit the (right) wing of the bird. The reference frames and the corresponding surfaces are shown in Fig. 3A,B.
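The body-plane step can be sketched as a total-least-squares plane fit via SVD, with the plane normal serving as the zb-axis. This is a simplified stand-in for the full procedure; the symmetry-line fit for xb and the paraboloid fit for the head origin are omitted:

```python
import numpy as np

def fit_body_frame(body_points):
    """Fit a plane to the low-motion body points; the direction of least
    variance (last right singular vector) is the plane normal, used here
    as the z_b-axis, oriented upward by convention."""
    body_points = np.asarray(body_points, dtype=float)
    centroid = body_points.mean(axis=0)
    _, _, Vt = np.linalg.svd(body_points - centroid)
    z_axis = Vt[-1]          # normal of the best-fit plane
    if z_axis[2] < 0:        # orient z_b upward
        z_axis = -z_axis
    return centroid, z_axis
```

The xb-axis would then be found within this plane (e.g. from the symmetry of the projected body points), completing a right-handed frame.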

Fig. 3.

The dorsal wing profile, twist and aerodynamic angle of attack of a bird are automatically tracked throughout the section of the downstroke for which the wing is fully in view. (A,B) 3D upper surface of a parrotlet at 41 and 65% of its first downstroke after take-off, respectively. Two reference frames are defined as a function of time: a translating body reference frame (subscript b) with zb pointing up and xb pointing forward, and a translating and rotating wing reference frame (subscript w). The origin of both reference frames is the top of the head. The wing reference frame is rotated about xb so that it is parallel with the right wing. The r-axis is parallel to yw and begins at the plane labeled r0, which intersects the bird's right shoulder. (C) Side view of the bird shown in A, illustrating the definition of the geometric, induced and effective angles of attack. (D) The shape of the bird at the centerline of its dorsal surface (yb=0), where the purple line corresponds to A and the yellow line corresponds to B. (E,F) The dorsal profile shape of the bird wing at wing radius r equal to 0.25R and 0.50R, where R is the length of the right wing from the plane r0 to the wingtip and is held constant across all frames; r is zero at the plane labeled r0. (G) The geometric angle of attack (twist) of the wing with respect to the body of the bird is approximately linear. (H) Induced angle of attack of the wing due to the bird's forward velocity and wing flapping velocity, based on the average spanwise velocity of the center of the wing. (I) Effective angle of attack, calculated as the geometric angle of attack minus the induced angle of attack.

Using these reference frames and the 3D data, surface metrics were computed. In Fig. 3D, the shape of the bird at its midline is tracked, while in Fig. 3E,F the shapes of dorsal airfoil slices of the wing are tracked at different spanwise positions. Based on these airfoils, we determined the angle of attack distribution spanwise along the wing (Fig. 3G–I). The geometric angle of attack was found with a linear fit to each airfoil, while the induced angle of attack was found by computing the direction of the velocity of the wing from root to tip using the bird's velocity and the angular velocity of the wing. To reduce noise in these linear fits, outliers were rejected using the RANSAC method (Fischler and Bolles, 1981). The effective angle of attack is the geometric angle of attack minus the induced angle of attack. Because of the angle of the bird's wing relative to the camera and projector positions, angle of attack measurements beyond the spanwise position halfway to the wingtip are less reliable. This could be resolved in future setups by adding cameras and projectors to better image the wing at these angles.
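The core of this calculation can be sketched in a few lines, with hypothetical sign conventions (the authors additionally reject outliers with RANSAC, which is omitted here):

```python
import numpy as np

def geometric_alpha(chordwise_x, surface_z):
    """Geometric angle of attack of one dorsal airfoil slice: the angle of
    a linear fit to the slice, in degrees."""
    slope = np.polyfit(chordwise_x, surface_z, 1)[0]
    return float(np.degrees(np.arctan(slope)))

def effective_alpha(alpha_geometric_deg, wing_velocity_xz):
    """Effective angle of attack = geometric minus induced, where the
    induced angle comes from the direction of the local wing velocity in
    the x-z plane (sign conventions illustrative)."""
    vx, vz = wing_velocity_xz
    alpha_induced = float(np.degrees(np.arctan2(vz, vx)))
    return alpha_geometric_deg - alpha_induced
```

For a wing moving forward and down, the induced angle is negative, so the effective angle of attack exceeds the geometric one, consistent with the high downstroke values reported below.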

RESULTS AND DISCUSSION

We quantified and analyzed the 3D wing and body shape of a flying bird during a portion of four downstrokes (41% to 65% on the first downstroke and 36% to 64% on the second downstroke after take-off) using a novel high-speed, automated, 3D structured-light method. While our results are for a single projector and camera combination imaging the dorsal side of the bird, more cameras and projectors can be added to obtain full body volumetric reconstruction and provide a more complete analysis of wing morphing over an entire wingbeat cycle. In our analysis of the dorsal data, we found that the bird's tail rotates down significantly with respect to the body (1720 deg s−1 in flight 1, 1550 deg s−1 in flight 2) during the first downstroke after take-off (Fig. 3D) but not the second (−290 deg s−1 in flight 1, 270 deg s−1 in flight 2) (Fig. S3; all angular velocities are averaged over the downstroke phase analyzed). The wings rotate down at an average of 5700 deg s−1 in this same portion of the downstroke. Further, the wings are tracked at different spanwise positions (Fig. 3E,F), and we see that the wing twists relative to the body through the downstroke as confirmed by the geometric angle of attack plot (Fig. 3G). Using these data, we computed the effective aerodynamic angle of attack (Fig. 3I), which remains relatively constant between 50 and 70 deg in the first downstroke and 45 and 60 deg in the second downstroke (Fig. S3C,F). These high angles during the downstroke at take-off enable the bird to support its body weight with both lift and drag, while simultaneously generating significant thrust by tilting lift forward. This is facilitated by the close to horizontal orientation of the wing surface moving predominantly downward (Fig. 3A,B) combined with the high lift and drag coefficients of a bird wing at these high angles of attack (Kruyt et al., 2014). 
These measurements illustrate how the system quantifies the way birds morph their wings to generate aerodynamic forces, and could inform future studies of maneuvering flight.

Beyond studying birds, this new structured-light method has many benefits for capturing 3D data in experimental biology and engineering. It is automated to allow for a high throughput of data, single-shot to track deforming objects, binary to allow for high-speed tracking, and uses a single color to allow for multiple view angles. While the current application of this technique is for studying bird flight, it is broadly adaptable for tracking shapes of other dynamically morphing animals, plants and objects of interest.

Acknowledgements

We thank J. W. Kruyt for initially helping us develop and test temporally encoded structured-light methods, and A. K. Stowers for reviewing the mathematics.

FOOTNOTES

  • Competing interests

    The authors declare no competing or financial interests.

  • Author contributions

    Conceptualization and writing – review and editing: M.E.D., D.L., A.A.B.; Methodology: M.E.D., D.L.; Software, validation, formal analysis, investigation, data curation, writing – original draft preparation, and visualization: M.E.D.; Resources and project administration: D.L., A.A.B.; Supervision: D.L.

  • Funding

    This research was supported by the National Science Foundation (NSF) Graduate Research Fellowship under grant no. DGE-114747 to M.E.D.; NSF grant IOS-0744056 to A.A.B.; and Office of Naval Research MURI grant N00014-10-1-0951, the Micro Autonomous Systems and Technology at the Army Research Laboratory–Collaborative Technology Alliance Center grant MCE-16-17-4.3, and NSF CAREER Award 1552419 to D.L.

  • Supplementary information

    Supplementary information available online at http://jeb.biologists.org/lookup/doi/10.1242/jeb.149708.supplemental

  • Received September 14, 2016.
  • Accepted March 21, 2017.
  • © 2017. Published by The Company of Biologists Ltd
http://www.biologists.com/user-licence-1-1/

References

  1. Ackerman, J. D., Keller, K. and Fuchs, H. (2002). Surface reconstruction of abdominal organs using laparoscopic structured light for augmented reality. Three-Dimensional Image Capture Appl. 4661, 39-46. doi:10.1117/12.460179
  2. Biesel, W., Butz, H. and Nachtigall, W. (1985). Erste Messungen der Flügelgeometrie bei frei gleitfliegenden Haustauben (Columba livia var. domestica) unter Benutzung neu ausgearbeiteter Verfahren der Windkanaltechnik und der Stereophotogrammetrie. Biona-Report 3, 138-160.
  3. Carruthers, A. C., Walker, S. M., Thomas, A. L. R. and Taylor, G. K. (2010). Aerodynamics of aerofoil sections measured on a free-flying bird. Proc. Inst. Mech. Eng. Part G J. Aerosp. Eng. 224, 855-864. doi:10.1243/09544100JAERO737
  4. Cheney, J. A., Konow, N., Middleton, K. M., Breuer, K. S., Roberts, T. J., Giblin, E. L. and Swartz, S. M. (2014). Membrane muscle function in the compliant wings of bats. Bioinspir. Biomim. 9, 025007. doi:10.1088/1748-3182/9/2/025007
  5. Combes, S. A. and Daniel, T. L. (2003). Into thin air: contributions of aerodynamic and inertial-elastic forces to wing bending in the hawkmoth Manduca sexta. J. Exp. Biol. 206, 2999-3006. doi:10.1242/jeb.00502
  6. Fischler, M. A. and Bolles, R. C. (1981). Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography. Commun. ACM 24, 381-395. doi:10.1145/358669.358692
  7. Fontaine, E. I., Zabala, F., Dickinson, M. H. and Burdick, J. W. (2009). Wing and body motion during flight initiation in Drosophila revealed by automated visual tracking. J. Exp. Biol. 212, 1307-1323. doi:10.1242/jeb.025379
  8. Guan, C., Hassebrook, L. G. and Lau, D. L. (2003). Composite structured light pattern for three-dimensional video. Opt. Express 11, 406-417. doi:10.1364/OE.11.000406
  9. Hedrick, T. L. (2008). Software techniques for two- and three-dimensional kinematic measurements of biological and biomimetic systems. Bioinspir. Biomim. 3, 034001. doi:10.1088/1748-3182/3/3/034001
  10. Hedrick, T. L., Tobalske, B. W. and Biewener, A. A. (2002). Estimates of circulation and gait change based on a three-dimensional kinematic analysis of flight in cockatiels (Nymphicus hollandicus) and ringed turtle-doves (Streptopelia risoria). J. Exp. Biol. 205, 1389-1409.
  11. Kawasaki, H., Furukawa, R., Sagawa, R. and Yagi, Y. (2008). Dynamic scene shape reconstruction using a single structured light pattern. In IEEE 2nd Joint 3DIM/3DPVT Conference on Computer Vision and Pattern Recognition, Vols 1-12, pp. 2806-2813.
  12. Kruyt, J. W., Quicazán-Rubio, E. M., van Heijst, G. F., Altshuler, D. L. and Lentink, D. (2014). Hummingbird wing efficacy depends on aspect ratio and compares with helicopter rotors. J. R. Soc. Interface 11, 570-581. doi:10.1098/rsif.2014.0585
  13. Lenar, J., Witkowski, M., Carbone, V., Kolk, S., Adamczyk, M., Sitnik, R., van der Krogt, M. and Verdonschot, N. (2013). Lower body kinematics evaluation based on a multidirectional four-dimensional structured light measurement. J. Biomed. Opt. 18, 056014. doi:10.1117/1.JBO.18.5.056014
  14. Lohry, W. and Zhang, S. (2014). High-speed absolute three-dimensional shape measurement using three binary dithered patterns. Opt. Express 22, 26752. doi:10.1364/OE.22.026752
  15. McKeon, R. T. and Flynn, P. J. (2010). Three-dimensional facial imaging using a static light screen (SLS) and a dynamic subject. IEEE Trans. Instrum. Meas. 59, 774-783. doi:10.1109/TIM.2009.2037874
  16. Muijres, F. T., Elzinga, M. J., Melis, J. M. and Dickinson, M. H. (2014). Flies evade looming targets by executing rapid visually directed banked turns. Science 344, 172-177. doi:10.1126/science.1248955
  17. Pennycuick, C. J. (2008). Modeling the Flying Bird. Burlington, MA: Academic Press.
  18. Ristroph, L., Berman, G. J., Bergou, A. J., Wang, Z. J. and Cohen, I. (2009). Automated hull reconstruction motion tracking (HRMT) applied to sideways maneuvers of free-flying insects. J. Exp. Biol. 212, 1324-1335. doi:10.1242/jeb.025502
  19. Ros, I. G., Badger, M. A., Pierson, A. N., Bassman, L. C. and Biewener, A. A. (2015). Pigeons produce aerodynamic torques through changes in wing trajectory during low speed aerial turns. J. Exp. Biol. 218, 480-490. doi:10.1242/jeb.104141
  20. Saberioon, M. M. and Cisar, P. (2016). Automated multiple fish tracking in three-dimension using a structured light sensor. Comput. Electron. Agric. 121, 215-221. doi:10.1016/j.compag.2015.12.014
  21. Sagawa, R., Sakashita, K., Kasuya, N., Kawasaki, H., Furukawa, R. and Yagi, Y. (2012). Grid-based active stereo with single-colored wave pattern for dense one-shot 3D scan. In IEEE International Conference on 3D Imaging, Modeling, Processing, Visualization and Transmission, pp. 363-370. doi:10.1109/3dimpvt.2012.41
  22. Salvi, J., Fernandez, S., Pribanic, T. and Llado, X. (2010). A state of the art in structured light patterns for surface profilometry. Pattern Recognit. 43, 2666-2680. doi:10.1016/j.patcog.2010.03.004
  23. Sarbolandi, H., Lefloch, D. and
    3. Kolb, A.
    (2015). Kinect range sensing: structured-light versus time-of-flight kinect. Comput. Vis. Image Underst. 139, 1-20. doi:10.1016/j.cviu.2015.05.006
    OpenUrlCrossRef
  24. ↵
    1. Su, W.-H. and
    2. Liu, H.
    (2006). Calibration-based two-frequency projected fringe profilometry: a robust, accurate, and single-shot measurement for objects with large depth discontinuities. Opt. Express 14, 9178-9187. doi:10.1364/OE.14.009178
    OpenUrlCrossRefPubMed
  25. ↵
    1. Tobalske, B. W.,
    2. Warrick, D. R.,
    3. Clark, C. J.,
    4. Powers, D. R.,
    5. Hedrick, T. L.,
    6. Hyder, G. A. and
    7. Biewener, A. A.
    (2007). Three-dimensional kinematics of hummingbird flight. J. Exp. Biol. 210, 2368-2382. doi:10.1242/jeb.005686
    OpenUrlAbstract/FREE Full Text
  26. ↵
    1. Walker, S. M.,
    2. Thomas, A. L. R. and
    3. Taylor, G. K.
    (2009). Deformable wing kinematics in the desert locust: how and why do camber, twist and topography vary through the stroke? J. R. Soc. Interface 6, 735-747. doi:10.1098/rsif.2008.0435
    OpenUrlAbstract/FREE Full Text
  27. ↵
    1. Wang, Y.,
    2. Laughner, J. I.,
    3. Efimov, I. R. and
    4. Zhang, S.
    (2013). 3D absolute shape measurement of live rabbit hearts with a superfast two-frequency phase-shifting technique. Opt. Express 21, 6631-6636. doi:10.1364/oe.21.005822
    OpenUrlCrossRef
  28. ↵
    1. Williams, C. D. and
    2. Biewener, A. A.
    (2015). Pigeons trade efficiency for stability in response to level of challenge during confined flight. Proc. Natl. Acad. Sci. USA 2015, 201407298. doi:10.1073/pnas.1407298112
    OpenUrlAbstract/FREE Full Text
  29. ↵
    1. Wolf, T. and
    2. Konrath, R.
    (2015). Avian wing geometry and kinematics of a free-flying barn owl in flapping flight. Exp. Fluids 56, 28. doi:10.1007/s00348-015-1898-6
    OpenUrlCrossRef
  30. ↵
    1. Wootton, R. J.
    (1992). Functional morphology of insect wings. Entomol 113-140. doi:10.1146/annurev.en.37.010192.000553
    OpenUrlCrossRefWeb of Science
  31. ↵
    1. Zhang, Z. H.
    (2012). Review of single-shot 3D shape measurement by phase calculation-based fringe projection techniques. Opt. Lasers Eng. 50, 1097-1106. doi:10.1016/j.optlaseng.2012.01.007
    OpenUrlCrossRef
  32. ↵
    1. Zhang, G.,
    2. Sun, J.,
    3. Chen, D. and
    4. Wang, Y.
    (2008). Flapping motion measurement of honeybee bilateral wings using four virtual structured-light sensors. Sensors Actuators A Phys. 148, 19-27. doi:10.1016/j.sna.2008.06.025
    OpenUrlCrossRefWeb of Science
Keywords

  • Animal locomotion
  • High speed
  • Single shot
  • Structured light
  • Surface reconstruction
  • Wing morphing

METHODS & TECHNIQUES
High-speed surface reconstruction of a flying bird using structured light
Marc E. Deetjen, Andrew A. Biewener, David Lentink
Journal of Experimental Biology 2017 220: 1956-1961; doi: 10.1242/jeb.149708

© 2021   The Company of Biologists Ltd   Registered Charity 277992