Edward N Bachelder

age ~58

from Manhattan Beach, CA

Also known as:
  • Edward Nelson Bachelder
  • Edward Bachelder
  • Ed Bachelder
128 40th St, Manhattan Beach, CA 90266 • 310-545-7260

Edward Bachelder Phones & Addresses

  • 128 40th St, Manhattan Beach, CA 90266 • 310-545-7260
  • Manhattan Beach, CA
  • 615 Catalina Ave, Redondo Beach, CA 90277
  • San Diego, CA
  • 260 Beacon St, Boston, MA 02116
  • Brighton, MA

Work

  • Position:
    Professional/Technical

Education

  • Degree:
    Associate degree or higher

US Patents

  • Autorotation Flight Control System
  • US Patent:
    7976310, Jul 12, 2011
  • Filed:
    Jan 13, 2006
  • Appl. No.:
    11/332078
  • Inventors:
    Edward N. Bachelder - Redondo Beach CA,
    Bimal L. Aponso - Rancho Palos Verdes CA
  • Assignee:
    Systems Technology, Inc. - Hawthorne CA
  • International Classification:
    G09B 9/08
  • US Classification:
    434/33
  • Abstract:
    The present invention provides a computer-implemented methodology that permits the safe landing and recovery of rotorcraft following engine failure. With this invention, successful autorotations may be performed from well within the unsafe operating area of a helicopter's height-velocity profile by employing a fast and robust real-time trajectory optimization algorithm that commands control motion through an intuitive pilot display, or directly in the case of autonomous rotorcraft. The algorithm generates optimal trajectories and control commands via the direct-collocation optimization method, solved using a nonlinear programming problem solver. The computed control inputs are collective pitch and aircraft pitch, which are easily tracked and manipulated by the pilot, or converted to control actuator commands for automated operation during autorotation in the case of an autonomous rotorcraft. The formulation of the optimal control problem has been carefully tailored so that the solutions resemble those of an expert pilot, accounting for the performance limitations of the rotorcraft and for safety concerns.
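
The abstract above names the key technique: trajectories are found by direct collocation and solved with a nonlinear programming solver, with collective pitch and aircraft pitch as the optimized controls. The patent text publishes no code, so the following is only a minimal sketch of direct collocation on an invented one-dimensional descent model (the dynamics, boundary conditions, bounds, and cost weights are all placeholders, not the patented rotorcraft model), using SciPy's SLSQP solver to illustrate the general method.

```python
# Minimal direct-collocation sketch on a toy 1-D descent model (illustration
# only; NOT the patented rotorcraft model or algorithm).
# State: altitude h [m] and sink rate v [m/s, positive down].
# Control: u in [0, 1], a crude stand-in for collective pitch that trades
# stored rotor energy for deceleration.
import numpy as np
from scipy.optimize import minimize

N = 20                     # collocation nodes
T = 8.0                    # fixed maneuver time [s] (invented)
dt = T / (N - 1)
g, k = 9.81, 12.0          # gravity and an invented deceleration gain

def unpack(z):
    return z[:N], z[N:2 * N], z[2 * N:]

def defects(z):
    """Trapezoidal collocation defects plus boundary conditions (all must be 0)."""
    h, v, u = unpack(z)
    dh = -v                # dh/dt: altitude decreases at the sink rate
    dv = g - k * u         # dv/dt: control slows the descent
    d_h = h[1:] - h[:-1] - 0.5 * dt * (dh[1:] + dh[:-1])
    d_v = v[1:] - v[:-1] - 0.5 * dt * (dv[1:] + dv[:-1])
    bc = [h[0] - 100.0, v[0] - 15.0, h[-1]]   # start at 100 m, 15 m/s; land at h = 0
    return np.concatenate([d_h, d_v, bc])

def cost(z):
    h, v, u = unpack(z)
    return v[-1] ** 2 + 1e-2 * np.sum(u ** 2)  # soft touchdown, modest control effort

z0 = np.concatenate([np.linspace(100.0, 0.0, N),   # altitude guess
                     np.full(N, 12.5),             # sink-rate guess
                     np.full(N, 0.8)])             # control guess
bounds = [(0.0, None)] * N + [(0.0, None)] * N + [(0.0, 1.0)] * N
res = minimize(cost, z0, method="SLSQP", bounds=bounds,
               constraints={"type": "eq", "fun": defects},
               options={"maxiter": 500})
h, v, u = unpack(res.x)
print("touchdown sink rate [m/s]:", round(float(v[-1]), 2), "| converged:", res.success)
```

A production formulation would of course use the full rotorcraft energy model, pilot-display coupling, and constraint set described in the patent; the sketch only shows how collocation defects turn an optimal-control problem into a finite-dimensional NLP.
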
  • Systems And Methods For Combining Virtual And Real-Time Physical Environments
  • US Patent:
    8040361, Oct 18, 2011
  • Filed:
    Jan 20, 2009
  • Appl. No.:
    12/321329
  • Inventors:
    Edward N. Bachelder - Redondo Beach CA,
    Noah Brickman - Ben Lomond CA
  • Assignee:
    Systems Technology, Inc. - Hawthorne CA
  • International Classification:
    G09G 5/00
  • US Classification:
    345/633, 345/592
  • Abstract:
    Systems, methods, and structures for combining virtual reality and a real-time environment by merging captured real-time video data with real-time 3D environment renderings to create a fused (combined) environment. Video imagery is captured in RGB or HSV color coordinate systems and processed to determine which areas should be made transparent, or have other color modifications made, based on sensed cultural features, electromagnetic spectrum values, and/or sensor line-of-sight. Sensed features can include electromagnetic radiation characteristics such as color, infrared, and ultraviolet light values; cultural features can include patterns of these characteristics, such as object recognition using edge detection. The processed image is then overlaid on, and fused into, a 3D environment to combine the two data sources into a single scene, creating an effect whereby a user can look through predesignated areas or “windows” in the video image to see into a 3D simulated world, and/or see other enhanced or reprocessed features of the captured image.
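
The abstract describes keying selected areas of the captured video to transparency, based on color (RGB/HSV) or other sensed features, and fusing the result with a rendered 3D scene. As a rough illustration only (not the patented implementation), the sketch below uses OpenCV to build an HSV color-key mask from a video frame and composite a rendered frame behind the keyed "window" pixels; the color range, blur size, and frame sources are invented placeholders.

```python
# Minimal sketch of color-keyed video/3D fusion (illustrative, not the
# patented system). Pixels of the live video frame whose HSV color falls
# inside a keyed range are replaced by the rendered 3D scene, so the user
# appears to look through those areas into the simulated world.
import cv2
import numpy as np

def fuse_frames(video_bgr: np.ndarray, render_bgr: np.ndarray,
                hsv_lo=(35, 60, 60), hsv_hi=(85, 255, 255)) -> np.ndarray:
    """Composite render_bgr into video_bgr wherever the video color falls
    inside [hsv_lo, hsv_hi] (a placeholder, green-screen-like range)."""
    hsv = cv2.cvtColor(video_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(hsv_lo), np.array(hsv_hi))  # 255 = show render
    mask = cv2.medianBlur(mask, 5)                # suppress speckle in the key
    mask3 = cv2.merge([mask, mask, mask]).astype(bool)
    return np.where(mask3, render_bgr, video_bgr)

# Hypothetical usage: both frames must share the same resolution.
# video = cv2.VideoCapture(0).read()[1]
# render = my_engine.render_current_view()   # placeholder for the 3D renderer
# cv2.imshow("fused", fuse_frames(video, render))
```
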
  • System For Combining Virtual And Real-Time Environments
  • US Patent:
    2007003, Feb 15, 2007
  • Filed:
    Apr 11, 2005
  • Appl. No.:
    11/104379
  • Inventors:
    Edward Bachelder - Redondo Beach CA,
    Noah Brickman - Topanga CA
  • International Classification:
    G09G 5/00
  • US Classification:
    345/633
  • Abstract:
    The present invention relates to a method and an apparatus for combining virtual reality and a real-time environment. The present invention provides a system that combines captured real-time video data and real-time 3D environment rendering to create a fused (combined) environment. The system captures video imagery and processes it to determine which areas should be made transparent (or have other color modifications made), based on sensed cultural features and/or sensor line-of-sight. Sensed features can include electromagnetic radiation characteristics (e.g., color, infrared, ultraviolet light). Cultural features can include patterns of these characteristics (e.g., object recognition using edge detection). The processed image is then overlaid on a 3D environment to combine the two data sources into a single scene. This creates an effect where a user can look through ‘windows’ in the video image into a 3D simulated world, and/or see other enhanced or reprocessed features of the captured image.
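
This earlier application also mentions recognizing cultural features through edge detection. Purely as an illustration under invented assumptions (Canny thresholds, block size, density cutoff), the sketch below derives a transparency mask from local edge density rather than color, marking low-texture blocks as see-through "windows"; nothing here is taken from the patent's actual implementation.

```python
# Illustrative sketch (not from the patent): build a transparency mask from
# edge structure rather than color. Blocks with very few detected edges
# (e.g., a flat panel placed in a cockpit mock-up) are treated as windows
# into the 3D scene. All thresholds are invented for demonstration.
import cv2
import numpy as np

def edge_based_mask(video_bgr: np.ndarray,
                    canny_lo=50, canny_hi=150, block=32,
                    density_thresh=0.02) -> np.ndarray:
    """Return a uint8 mask (255 = show 3D render) for low-edge-density blocks."""
    gray = cv2.cvtColor(video_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, canny_lo, canny_hi)
    h, w = edges.shape
    mask = np.zeros((h, w), dtype=np.uint8)
    for y in range(0, h, block):
        for x in range(0, w, block):
            patch = edges[y:y + block, x:x + block]
            if patch.mean() / 255.0 < density_thresh:   # few edges -> "window"
                mask[y:y + block, x:x + block] = 255
    return mask

# Hypothetical usage, compositing a placeholder render behind the masked blocks:
# mask = edge_based_mask(video)
# fused = np.where(cv2.merge([mask, mask, mask]).astype(bool), render, video)
```
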
  • Systems And Methods For Combining Virtual And Real-Time Physical Environments
  • US Patent:
    2010018, Jul 22, 2010
  • Filed:
    Jan 19, 2009
  • Appl. No.:
    12/356048
  • Inventors:
    Edward N. Bachelder - Redondo Beach CA,
    Noah Brickman - Ben Lomond CA
  • International Classification:
    G09G 5/00
  • US Classification:
    345/633, 345/8
  • Abstract:
    Systems, methods, and structures for combining virtual reality and a real-time environment by merging captured real-time video data with real-time 3D environment renderings to create a fused (combined) environment. Video imagery is captured in RGB or HSV color coordinate systems and processed to determine which areas should be made transparent, or have other color modifications made, based on sensed cultural features, electromagnetic spectrum values, and/or sensor line-of-sight. Sensed features can include electromagnetic radiation characteristics such as color, infrared, and ultraviolet light values; cultural features can include patterns of these characteristics, such as object recognition using edge detection. The processed image is then overlaid on, and fused into, a 3D environment to combine the two data sources into a single scene, creating an effect whereby a user can look through predesignated areas or “windows” in the video image to see into a 3D simulated world, and/or see other enhanced or reprocessed features of the captured image.
