METHOD, COMPUTER PROGRAM AND APPARATUS FOR DETERMINING MOTION INFORMATION


FIELD OF THE INVENTION

The present invention relates to motion tracking. More particularly, the present invention relates to a method, a computer program and an apparatus for determining motion information of an object from a video signal.

BACKGROUND OF THE INVENTION

Motion capture is a term that describes the process of recording motion and translating the motion onto a digital model. Presently there are different kinds of commercial products available which are based on various technologies, e.g. optical systems and non-optical systems. Some of the optical systems are based on the use of special optical markers whose movements are recorded on video. To reproduce the movement, the video is analysed using various techniques. Examples of the non-optical systems include e.g. various kinds of inertial sensors. When using inertial sensors, the measurement information has to be transmitted to a receiving entity either in a wired or a wireless manner.

The existing motion capture systems have several drawbacks. The systems, e.g. optical systems, are often complex, require much processing power and need special equipment to work. Furthermore, if non-optical systems are used, a special receiver is needed to receive the transmission from the wired or wireless transmitter. Based on the above, there is an obvious need for a solution that would mitigate and/or alleviate the above drawbacks.

SUMMARY OF THE INVENTION

According to an aspect there is provided a method for determining motion information of an object from a video signal. The method comprises: receiving the video signal, the video signal comprising a plurality of frames representing motion; analysing the video signal to determine at least one identifiable object in the plurality of frames of the video signal; and determining motion information of the object orthogonal to the two-dimensional planar motion of the video signal by tracking changes in optical properties of the at least one identifiable object in the plurality of frames of the video signal.

According to another aspect of the invention there is provided a computer program comprising program code which, when executed on a processor, implements the method of any of the claims.

According to another aspect of the invention there is provided an apparatus for determining motion information of an object from a video signal. The apparatus comprises: a processor; means for receiving the video signal; and a memory comprising a computer program which, when executed by the processor, is configured to: receive the video signal, the video signal comprising a plurality of frames representing motion; analyse the video signal to determine at least one identifiable object in the plurality of frames of the video signal; and determine motion information of the object orthogonal to the two-dimensional planar motion of the video signal by tracking changes in optical properties of the at least one identifiable object in the plurality of frames of the video signal.

According to yet another aspect of the invention there is provided a system for determining motion information of an object from a video signal.

The system comprises: a video camera; at least one identifiable object attachable to a target; an apparatus comprising a processor; means for receiving the video signal; and a memory comprising a computer program which, when executed by the processor, is configured to: receive the video signal, the video signal comprising a plurality of frames representing motion; analyse the video signal to determine at least one identifiable object in the plurality of frames of the video signal; and determine motion information of the object orthogonal to the two-dimensional planar motion of the video signal by tracking changes in optical properties of the at least one identifiable object in the plurality of frames of the video signal.

The advantages of the invention relate to simplicity. With the invention it is possible to track three-dimensional motion by using only one video camera. Furthermore, since the motion information is coded into a video signal, the decoding can be done regardless of the location.

BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying drawings, which are included to provide a further understanding of the invention and constitute a part of this specification, illustrate embodiments of the invention and together with the description help to explain the principles of the invention. In the drawings:

Figure 1 discloses a flow diagram of a method according to one embodiment of the invention,
Figure 2 discloses an initial arrangement according to one embodiment of the invention,
Figures 3A and 3B disclose an embodiment for decoding three-dimensional movement from a video signal,
Figures 4A and 4B disclose another embodiment for decoding three-dimensional movement from a video signal,
Figures 5A and 5B disclose another embodiment for decoding three-dimensional movement from a video signal,
Figure 6 discloses an apparatus according to one embodiment of the invention, and
Figure 7 discloses a system according to one embodiment of the invention.

DETAILED DESCRIPTION OF THE INVENTION

Reference will now be made in detail to the embodiments of the present invention, examples of which are illustrated in the accompanying drawings.

Figure 1 discloses a flow diagram illustrating one embodiment of the invention. At step 100, a video signal is received. The video signal comprises a plurality of frames. The signal itself may have any suitable frame rate, e.g. 24 fps, 30 fps, higher or even lower. Furthermore, the pixel size of a single frame depends on the camera used to record the video signal. At step 102 the video signal is analysed to determine at least one identifiable object in the plurality of frames of the video signal. The identifiable object refers e.g. to a particular shape, colour or optical element which is identifiable from a frame of the video signal. At step 104 motion information of the object orthogonal to the two-dimensional planar motion of the video signal is determined by tracking changes in optical properties of the at least one identifiable object in the plurality of frames of the video signal. In general this means that motion information of the object orthogonal to the two-dimensional planar motion of the video signal can be determined from a two-dimensional video signal. This makes it possible e.g. to capture and determine movement information of the object in each of the three dimensions based on a video signal from a single camera.
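The following sketch is given purely as an illustration of the three steps of Figure 1 (steps 100, 102 and 104); it is not the claimed implementation. OpenCV and NumPy are assumed only as a convenient way to read frames and locate a bright object, and all function and file names are hypothetical:

```python
# Illustrative three-step pipeline of Figure 1: receive frames (step 100),
# find an identifiable object (step 102), track an optical property (step 104).
import cv2
import numpy as np

def receive_video(path):
    """Step 100: receive the video signal as a sequence of frames."""
    cap = cv2.VideoCapture(path)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        yield frame
    cap.release()

def find_identifiable_object(frame):
    """Step 102: locate an identifiable object, here a bright blob
    (e.g. a retroreflective marker or an LED), by simple thresholding."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    mask = gray > 200                       # keep only bright pixels
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return (xs.mean(), ys.mean(), xs.size)  # centroid (x, y) and pixel area

def track_optical_changes(detections):
    """Step 104: derive the depth cue from changes in an optical property,
    here the apparent size of the object between consecutive frames."""
    cues, prev_area = [], None
    for x, y, area in detections:
        if prev_area is not None:
            cues.append(area - prev_area)   # > 0: object approaching the camera
        prev_area = area
    return cues

if __name__ == "__main__":
    dets = [d for d in map(find_identifiable_object, receive_video("capture.avi")) if d]
    print(track_optical_changes(dets))
```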

Figure 2 illustrates an initial arrangement of the invention at hand. A video camera 24 is recording movements 20 of at least one object. The movements happen three-dimensionally, but the video camera 24 captures only a two-dimensional motion plane of the movements 20. Thus, the objective of the invention is to capture three-dimensional movements from a video signal presenting only a two-dimensional motion plane.

Figures 3A and 3B disclose an embodiment in which motion information of an identifiable object 30 orthogonal to the two-dimensional planar motion component of the object 30 is determined by tracking changes in optical properties of the object 30. In the embodiment of Figures 3A and 3B, the two-dimensional motion component of the object 30 is determined from the video signal by using e.g. one or more markers (not shown in Figures 3A and 3B). The marker, e.g. a passive marker, is coated e.g. with a retroreflective material to reflect light back to a camera. The two-dimensional motion can then be tracked by tracking the position of the marker in the two-dimensional planar video signal.

Figures 3A and 3B represent consecutive frames of the video signal, and motion information of the object 30 orthogonal to the two-dimensional planar motion component is calculated from changes in the relative size of the at least one identifiable object 30 in the frames. In Figure 3A the object 30 is represented with a black circle having position coordinates x1, y1, z1. It is evident to a skilled person that there might not be any fixed coordinate system in use. A more important piece of information is the difference in coordinates (positions) between two consecutive frames. In Figure 3B the object 30 has moved to position (x2, y2, z2) compared to the position in the previous frame (Figure 3A).

In this embodiment only the object 30 moves and the camera stays in place. In Figure 3A the relative size of the object 30 in the video signal has a first value. In Figure 3B the relative size of the object 30 is bigger. Since the camera stays in place, the change in the relative size of the identifiable object 30 represents movement of the object in a direction orthogonal to the two-dimensional planar motion component determined e.g. from the earlier mentioned marker. In short, the bigger the object 30 (the black circle in this case) is in a frame, the closer it is to the camera.

Furthermore, the accuracy of determining motion in the direction orthogonal to the two-dimensional planar motion component may also depend on the total pixel size and the frame rate of the camera. For example, an inexpensive web camera may have a frame rate of 30 frames per second (fps) and a total pixel size of 640x480. It is evident that if the object moves fast, the movement is captured more accurately with a higher fps value. Furthermore, the size of a single pixel in a 640x480 web camera is much bigger than e.g. in a better three-megapixel camera (e.g. 2048x1536 pixels). Thus, the motion information of the object 30 orthogonal to the two-dimensional planar motion component becomes more accurate as the total pixel size and the frame rate of the camera increase.

Furthermore, it was mentioned above that a marker is separate from the light source. In another embodiment, the marker and the light source comprise a single element, i.e. the light source acts also as a marker.
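Under a simple pinhole-camera assumption the apparent size of the object is inversely proportional to its distance from the camera, so the change in relative size described above maps directly to a relative depth change. A minimal sketch of that relationship is given below; the reference distance is an assumed calibration value, not something specified by the embodiment:

```python
# Illustration of depth change from relative marker size (Figures 3A and 3B).
# Pinhole model: apparent diameter is proportional to 1 / distance, so
# distance_2 = distance_1 * (diameter_1 / diameter_2).
import math

def relative_depth(area_prev: float, area_curr: float, depth_prev: float) -> float:
    """Estimate the new camera-to-object distance from the change in the
    object's pixel area between two consecutive frames.

    area_prev, area_curr -- pixel areas of the identifiable object 30
    depth_prev           -- distance in the previous frame (assumed known
                            from an initial calibration)
    """
    # Linear size scales with sqrt(area); distance scales with 1 / linear size.
    diameter_ratio = math.sqrt(area_prev / area_curr)
    return depth_prev * diameter_ratio

# Example: the marker grows from 400 px^2 to 900 px^2 while the camera is static,
# so the object has moved closer: 1.0 m -> about 0.67 m.
print(relative_depth(400.0, 900.0, 1.0))
```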

Figures 4A and 4B disclose another embodiment of the invention at hand. In the embodiment of Figures 4A and 4B, motion of an object 40 in a direction orthogonal to the two-dimensional planar motion component is determined based on the optical properties of the object 40.

In the embodiment of Figures 4A and 4B, the two-dimensional motion component of the object 40 is determined from the video signal by using e.g. one or more markers 42. The marker 42, e.g. a passive marker, is coated e.g. with a retroreflective material to reflect light back to a camera. The two-dimensional motion can then be tracked by tracking the position of the marker 42 in the two-dimensional planar video signal.

In the embodiment of Figures 4A and 4B, the object 40 is a light source. The light source is e.g. a light emitting diode (LED). Depth information, i.e. the third motion component of the object 40, is determined from the signals of the light source in the video signal. The object 40 (light source) is attached to a target. The target is e.g. a hand, a fist, a foot or any other target the movement of which is to be tracked. As said above, the movement of the target is coded in a predetermined manner into signals of the light source 40 so that it is possible to determine the last needed motion component from the video signal.

The coding is based on e.g. the use of at least one accelerometer. The accelerometer itself may be based on any possible and applicable technology. The measurements of the accelerometer are coded into light signals so that it is possible to decode the previously encoded motion information afterwards. The coding is executed so that both the quantity and the direction of the motion can be decoded from the signals of the light source 40. Furthermore, the intensity of light may also be used in the coding procedure, e.g. to indicate whether the motion is away from or toward the camera.

In short, a video camera is used to record movements of a target. The target is e.g. a body part of a human being, e.g. a fist.

The body part is equipped with at least one accelerometer which provides information about the moving body part. The accelerometer tracks movements of the body part, e.g. a fist. Measurements of the accelerometer are coded into a light signal of at least one light source 40 in a predetermined manner. It is evident that the target may be equipped with multiple accelerometers in different locations and thus also with multiple light sources 40 providing coded movement information of the multiple accelerometers.

Since the video camera is used to record movements of the target, it also tracks the light source 40. In other words, the coded light signal of the light source 40 appears on the video signal. The video signal is then transmitted to a recipient via a communication network, e.g. the Internet. Alternatively, the video camera may be locally connected to a receiving processing device. At the receiving end, a data processing device, e.g. a computer, is provided with a software application which is arranged to analyse the video signal to track the two-dimensional planar motion component of the object. The two-dimensional planar motion component is determined e.g. based on signals provided by the at least one passive marker or by any other suitable technique. Several techniques for tracking two-dimensional planar motion of a target are known in the art, and therefore they are not described here in more detail. In addition to the two-dimensional planar motion component, the software application determines motion in a direction orthogonal to the two-dimensional planar motion component from the signals of the light source, the signals of the light source having been modulated based on motion of the object, e.g. the fist already mentioned above. Based on the two-dimensional planar motion component and the modulated light signals, the software application is able to determine movements three-dimensionally.
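The embodiment leaves the exact modulation scheme open. Purely as one assumed possibility, the sketch below codes a single depth-axis accelerometer reading into an LED brightness level around a mid-point (quantity in the deviation, direction in the sign) and decodes it back from per-frame brightness samples of the light source 40; the constants and function names are illustrative assumptions:

```python
# Assumed modulation scheme for the Figure 4A/4B embodiment: acceleration along
# the depth axis is mapped to an LED brightness around a mid-level, so that both
# quantity (deviation) and direction (sign) survive the trip through the video.
MID_LEVEL = 128          # brightness corresponding to zero acceleration
GAIN = 4.0               # assumed brightness units per (m/s^2)

def encode_brightness(accel_depth: float) -> int:
    """Transmitter side: accelerometer reading -> LED level (0-255)."""
    level = MID_LEVEL + GAIN * accel_depth
    return int(max(0, min(255, round(level))))

def decode_acceleration(brightness_samples):
    """Receiver side: per-frame LED brightness -> acceleration.

    brightness_samples -- mean brightness of the light source 40 measured in
    each video frame (e.g. from the pixels around the tracked marker 42).
    """
    return [(b - MID_LEVEL) / GAIN for b in brightness_samples]

# Example round trip: +2 m/s^2 toward the camera, then -1 m/s^2 away from it.
frames = [encode_brightness(a) for a in (2.0, -1.0, 0.0)]
print(frames)                       # [136, 124, 128]
print(decode_acceleration(frames))  # [2.0, -1.0, 0.0]
```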

The decoding algorithm used in the software application to decode light signals from the video signal is apparent to a skilled person. In short, the software application combines the two-dimensional information and the motion information from the light source into three-dimensional motion information.

Furthermore, it was mentioned above that a marker is separate from the light source. In another embodiment, the marker and the light source comprise a single element, i.e. the light source acts also as a marker.

A clear advantage of the solution disclosed in Figures 4A and 4B is that three-dimensional motion tracking can be achieved with only one camera and that there are no special requirements for the camera itself. Of course, it is evident that more accurate cameras may provide a more accurate video signal. A further advantage is that the decoding can be done practically anywhere without special equipment; only the software application is needed to decode the video signal.

Figures 5A and 5B disclose another embodiment of the invention at hand. In the embodiment of Figures 5A and 5B, motion of an object in each of the three dimensions is determined based on the optical properties of the object. In the embodiment of Figures 5A and 5B, the object is a set of three light sources: a red light source 50, a green light source 52 and a blue light source 54. Each of the light sources comprises e.g. a light emitting diode (LED). The individual light sources may be set so close to each other that they appear as a single point of light to an outside observer. Instead of three LED light sources it is possible to use e.g. layered displays (light sources) where the three RGB components are superimposed.

The light sources 50, 52, 54 are attached to a target. The target is e.g. a hand, a foot or any other target the movement of which is to be tracked. The movement of the target is coded in a predetermined manner into signals of the light sources 50, 52, 54 so that it is possible to determine the needed motion components from the video signal.

The coding is based on e.g. the use of at least one accelerometer. The object may comprise a built-in three-axis accelerometer for tracking motion. In another embodiment, the accelerometer is a component external to the object. The accelerometer itself may be based on any possible and applicable technology. The measurements of the accelerometer are coded into light signals so that it is possible to decode the previously encoded motion information afterwards. In this embodiment, each axis (X, Y, Z) is coded into a corresponding light source. For example, the X-axis is coded into the red light source 50, the Y-axis is coded into the green light source 52 and the Z-axis is coded into the blue light source 54. The coding is executed so that both the quantity and the direction of the motion can be decoded from the signals of the light sources 50, 52, 54. Furthermore, the intensity of light may also be used in the coding procedure, e.g. to indicate whether the motion is away from or toward the camera. In another embodiment, the intensities of the three light sources may be used in the coding procedure. The coding procedure may then e.g. use the proportions of the intensities of the light sources.
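As one assumed realisation of the X-to-red, Y-to-green, Z-to-blue mapping mentioned above, the sketch below drives each colour channel around a common mid-level in proportion to the corresponding accelerometer axis; the gain and mid-level are illustrative values, not part of the embodiment:

```python
# Assumed per-axis coding for the Figure 5A/5B embodiment: each accelerometer
# axis modulates the intensity of one colour channel (X -> red 50, Y -> green 52,
# Z -> blue 54) around a common mid-level.
MID_LEVEL = 128
GAIN = 4.0  # assumed intensity units per (m/s^2)

def encode_rgb(accel_xyz):
    """Accelerometer reading (ax, ay, az) -> (R, G, B) drive levels, each 0-255."""
    return tuple(
        int(max(0, min(255, round(MID_LEVEL + GAIN * a)))) for a in accel_xyz
    )

# Example: acceleration of (1.0, -2.0, 0.5) m/s^2 -> RGB drive levels.
print(encode_rgb((1.0, -2.0, 0.5)))   # (132, 120, 130)
```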

In short, a video camera is used to record movements of a target. The target is e.g. a body part of a human being, e.g. a fist. The body part is equipped with a three-axis accelerometer which provides information about the moving body part. The accelerometer tracks movements of the body part, e.g. a fist. Measurements of the accelerometer are coded into light signals of the three light sources 50, 52, 54 in a predetermined manner.

Since the video camera is used to record movements of the target, it also tracks the light sources 50, 52, 54. In other words, the coded light signals of the light sources 50, 52, 54 appear on the video signal. The video signal is then transmitted to a recipient via a communication network, e.g. the Internet. Alternatively, the video camera may be locally connected to a receiving processing device. At the receiving end, a data processing device, e.g. a computer, is provided with a software application which is arranged to analyse the video signal to determine motion in a first direction from signals of the red light source, the signals of the red light source having been modulated based on motion in the first direction. Furthermore, the software application determines motion in a second direction from signals of the green light source, the signals of the green light source having been modulated based on motion in the second direction, and motion in a third direction from signals of the blue light source, the signals of the blue light source having been modulated based on motion in the third direction.

Based on the decoded three-dimensional motion components, the software application is able to determine movements three-dimensionally. The decoding algorithm used in the software application to decode the light signals from the video signal is apparent to a skilled person. In short, the software application decodes all the needed three-dimensional motion information from the video signal, where the video signal comprises RGB coded motion information.
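A matching per-channel decoder, assuming the illustrative coding sketched above and assuming that the mean R, G and B values of the pixels covering the light sources 50, 52, 54 have already been sampled from each frame, could look as follows; it is an example of such a decoding algorithm, not a prescribed one:

```python
# Assumed per-channel decoder matching the encode_rgb() sketch above: the mean
# R, G and B values sampled at the light sources in each frame are mapped back
# to per-axis acceleration, one colour channel per axis.
MID_LEVEL = 128
GAIN = 4.0  # must match the gain assumed on the encoding side

def decode_rgb_frames(rgb_samples):
    """rgb_samples -- iterable of (R, G, B) values measured per video frame.
    Returns a list of (ax, ay, az) accelerations, one per frame."""
    return [tuple((c - MID_LEVEL) / GAIN for c in rgb) for rgb in rgb_samples]

# Example: two frames of sampled light-source colours.
frames = [(132, 120, 130), (128, 128, 136)]
print(decode_rgb_frames(frames))  # [(1.0, -2.0, 0.5), (0.0, 0.0, 2.0)]
```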

A clear advantage of the solution disclosed in Figures 5A and 5B is that three-dimensional motion tracking can be achieved with only one camera and that there are no special requirements for the camera itself. Of course, it is evident that more accurate cameras may provide a more accurate video signal. A further advantage is that the decoding can be done practically anywhere without special equipment; only the software application is needed to decode the video signal.

In one embodiment of Figures 5A and 5B, decoding motion information from the three light sources is performed a bit differently. Motion is first determined from signals of a first light source. The signals of the first light source have previously been modulated as a function of the absolute sum of acceleration. The second light source acts as a static reference signal. The direction of the motion is then determined from signals of a third light source. Thus, the three-dimensional motion is determined based on the decoded signals from the first, second and third light sources.
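The embodiment does not state how the static reference signal is used. One natural reading, sketched below purely as an assumption, is that the reference source normalises away global changes in illumination and exposure before the magnitude and direction sources are interpreted:

```python
# Assumed decoder for the variant with three light sources: the first source
# carries the magnitude (modulated by the absolute sum of acceleration), the
# second is a static reference used here to normalise away global brightness
# changes, and the third carries the direction as a sign.
REF_NOMINAL = 200.0   # assumed brightness of the reference source at calibration
GAIN = 4.0            # assumed brightness units per (m/s^2)
MID_LEVEL = 128.0     # assumed zero-magnitude level of the first source

def decode_variant(first, reference, third):
    """first, reference, third -- brightness of the three sources in one frame.
    Returns a signed acceleration magnitude along the tracked direction."""
    # Normalise for lighting/exposure changes using the static reference source.
    scale = REF_NOMINAL / max(reference, 1e-6)
    magnitude = max(0.0, (first * scale - MID_LEVEL) / GAIN)
    direction = 1.0 if third * scale >= MID_LEVEL else -1.0
    return direction * magnitude

# Example frame: magnitude source at 160, reference dimmed to 180, direction "high".
print(decode_variant(160.0, 180.0, 200.0))  # approximately +12.4
```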

Figure 6 discloses an apparatus 60 according to one embodiment of the invention. The apparatus 60 comprises a processor 64 to which a memory 62 and a video signal receiver 66 are connected. The video signal receiver receives the video signal e.g. via a local connection or via a data interface connection, e.g. via a local area network interface of the apparatus. Although Figure 6 discloses only one memory 62 connected to the processor 64, the apparatus may also include other memories. The memory 62 comprises a software application which is configured to implement the motion decoding steps disclosed in the description of Figures 3-5.

The apparatus 60 may also include a receiver configured to receive a radio transmission. The processor 64 is then configured to determine motion in a direction orthogonal to the two-dimensional planar motion component by simultaneously recording motion data received by means of the radio transmission from the object. The information received from the radio transmission can be used to combine the planar motion information from the video signal with the motion data received by the radio transmission and to correct and reference the motion information sent by the radio transmission. The receiver may receive the same information also via a wired interface. In the case of radio transmission, the earlier disclosed identifiable object may itself comprise a radio transmitter transmitting information provided by the accelerometer. In another embodiment, the radio transmitter may be a separate entity from the object.
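One assumed way to realise this combination, sketched below, is to let the radio link deliver per-frame depth-axis accelerometer samples, integrate them into a depth offset and attach that offset to the planar track extracted from the video signal; the function name, frame rate and simple double integration are illustrative only:

```python
# Assumed fusion of the planar track from the video signal with depth-axis
# accelerometer samples received over the radio link: the acceleration is
# integrated twice into a depth offset and attached to each (x, y) position.
def fuse_planar_and_radio(planar_track, radio_accel, frame_rate=30.0):
    """planar_track -- [(x, y), ...] positions tracked from the video frames
    radio_accel  -- [a_z, ...] depth-axis acceleration per frame (m/s^2)
    Returns [(x, y, z), ...] with z integrated from the radio data."""
    dt = 1.0 / frame_rate
    velocity, depth, fused = 0.0, 0.0, []
    for (x, y), a_z in zip(planar_track, radio_accel):
        velocity += a_z * dt          # integrate acceleration -> velocity
        depth += velocity * dt        # integrate velocity -> depth offset
        fused.append((x, y, depth))
    return fused

# Example: the marker drifts right in the image while accelerating toward the camera.
print(fuse_planar_and_radio([(10, 5), (11, 5), (12, 6)], [0.0, 2.0, 2.0]))
```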

Figure 7 discloses a system according to one embodiment of the invention. The system comprises one or more identifiable objects 70. The objects have been described in more detail previously in the description. The objects are tracked with a video camera 72. The video camera 72 has a connection to a processing apparatus 74 decoding motion information from the video signal. The connection may be wireless or wired. Furthermore, the decoding process itself is implemented e.g. with a software application running in the processing apparatus. The processing apparatus itself may be any possible device, e.g. a personal computer, a laptop computer, a hand-held computer, a mobile terminal, a mobile phone, a gaming console, a personal digital assistant etc.

The solution disclosed in the invention can be used e.g. in gaming applications and/or apparatuses, various sports, computer vision applications, telerehabilitation etc.

The exemplary embodiments can include, for example, any suitable servers, workstations, PCs, laptop computers, personal digital assistants (PDAs), Internet appliances, handheld devices, cellular telephones, smart phones, wireless devices, other devices, and the like, capable of performing the processes of the exemplary embodiments. The devices and subsystems of the exemplary embodiments can communicate with each other using any suitable protocol and can be implemented using one or more programmed computer systems or devices.

One or more interface mechanisms can be used with the exemplary embodiments, including, for example, Internet access, telecommunications in any suitable form (e.g., voice, modem, and the like), wireless communications media, and the like. For example, employed communications networks or links can include one or more wireless communications networks, cellular communications networks, 3G communications networks, Public Switched Telephone Networks (PSTNs), Packet Data Networks (PDNs), the Internet, intranets, a combination thereof, and the like.

It is to be understood that the exemplary embodiments are for exemplary purposes, as many variations of the specific hardware used to implement the exemplary embodiments are possible, as will be appreciated by those skilled in the hardware and/or software art(s). For example, the functionality of one or more of the components of the exemplary embodiments can be implemented via one or more hardware and/or software devices.

The exemplary embodiments can store information relating to various processes described herein. This information can be stored in one or more memories, such as a hard disk, optical disk, magneto-optical disk, RAM, and the like. One or more databases can store the information used to implement the exemplary embodiments of the present inventions. The databases can be organized using data structures (e.g., records, tables, arrays, fields, graphs, trees, lists, and the like) included in one or more memories or storage devices listed herein.

The processes described with respect to the exemplary embodiments can include appropriate data structures for storing data collected and/or generated by the processes of the devices and subsystems of the exemplary embodiments in one or more databases.

All or a portion of the exemplary embodiments can be conveniently implemented using one or more general purpose processors, microprocessors, digital signal processors, micro-controllers, and the like, programmed according to the teachings of the exemplary embodiments of the present inventions, as will be appreciated by those skilled in the computer and/or software art(s). Appropriate software can be readily prepared by programmers of ordinary skill based on the teachings of the exemplary embodiments, as will be appreciated by those skilled in the software art. In addition, the exemplary embodiments can be implemented by the preparation of application-specific integrated circuits or by interconnecting an appropriate network of conventional component circuits, as will be appreciated by those skilled in the electrical art(s). Thus, the exemplary embodiments are not limited to any specific combination of hardware and/or software.

Stored on any one or on a combination of computer readable media, the exemplary embodiments of the present inventions can include software for controlling the components of the exemplary embodiments, for driving the components of the exemplary embodiments, for enabling the components of the exemplary embodiments to interact with a human user, and the like. Such software can include, but is not limited to, device drivers, firmware, operating systems, development tools, applications software, and the like. Such computer readable media further can include the computer program product of an embodiment of the present inventions for performing all or a portion (if processing is distributed) of the processing performed in implementing the inventions.

Computer code devices of the exemplary embodiments of the present inventions can include any suitable interpretable or executable code mechanism, including but not limited to scripts, interpretable programs, dynamic link libraries (DLLs), Java classes and applets, complete executable programs, Common Object Request Broker Architecture (CORBA) objects, and the like. Moreover, parts of the processing of the exemplary embodiments of the present inventions can be distributed for better performance, reliability, cost, and the like.

As stated above, the components of the exemplary embodiments can include computer readable media or memories for holding instructions programmed according to the teachings of the present inventions and for holding data structures, tables, records, and/or other data described herein. Computer readable media can include any suitable medium that participates in providing instructions to a processor for execution. Such a medium can take many forms, including but not limited to, non-volatile media, volatile media, transmission media, and the like. Non-volatile media can include, for example, optical or magnetic disks, magneto-optical disks, and the like. Volatile media can include dynamic memories, and the like. Transmission media can include coaxial cables, copper wire, fiber optics, and the like. Transmission media also can take the form of acoustic, optical, electromagnetic waves, and the like, such as those generated during radio frequency (RF) communications, infrared (IR) data communications, and the like. Common forms of computer-readable media can include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape, any other suitable magnetic medium, a CD-ROM, a CD-R, a CD-RW, a DVD, a DVD-ROM, DVD±RW, DVD±R, any other suitable optical medium, punch cards, paper tape, optical mark sheets, any other suitable physical medium with patterns of holes or other optically recognizable indicia, a RAM, a PROM, an EPROM, a FLASH-EPROM, any other suitable memory chip or cartridge, a carrier wave or any other suitable medium from which a computer can read.

While the present inventions have been described in connection with a number of exemplary embodiments and implementations, the present inventions are not so limited, but rather cover various modifications and equivalent arrangements, which fall within the purview of prospective claims.

CLAIMS:

1. A method for determining motion information of an object from a video signal, the method comprising:
receiving the video signal, the video signal comprising a plurality of frames representing motion;
analysing the video signal to determine at least one identifiable object in the plurality of frames of the video signal; and
determining motion information of the object orthogonal to the two-dimensional planar motion of the video signal by tracking changes in optical properties of the at least one identifiable object in the plurality of frames of the video signal.

2. The method according to claim 1, further comprising:
analysing the video signal to track a two-dimensional planar motion component of the object, and wherein the determining comprises:
determining the relative size of the at least one identifiable object in the frames of the video signal; and
calculating motion information of the object orthogonal to the two-dimensional planar motion component from the changes in the relative size of the at least one identifiable object in the frames of the video signal.

3. The method according to claim 1, wherein the at least one identifiable object comprises at least one light source.

4. The method according to claim 3, wherein the determining comprises:
analysing the video signal to track a two-dimensional planar motion component of the object, wherein the identifiable object is a light source and the determining comprises:
determining motion in a direction orthogonal to the two-dimensional planar motion component from signals of the light source, the signals of the light source having been modulated based on motion of the object.

5. The method according to claim 3, wherein the at least one identifiable object comprises a red light source, a green light source and a blue light source, and the determining comprises:
determining motion in a first direction from signals of the red light source, the signals of the red light source having been modulated based on motion of the object in the first direction;
determining motion in a second direction from signals of the green light source, the signals of the green light source having been modulated based on motion of the object in the second direction; and
determining motion in a third direction from signals of the blue light source, the signals of the blue light source having been modulated based on motion of the object in the third direction.

6. The method according to any of claims 1-5, wherein the at least one identifiable object comprises three light sources, and wherein the determining comprises:
determining motion from signals of a first light source, the signals of the first light source having been modulated as a function of the absolute sum of acceleration;
determining a static reference signal from signals of a second light source;
determining the direction of the motion from signals of a third light source; and
determining the three-dimensional motion based on the decoded signals from the first, second and third light sources.

7. The method according to any of claims 1-6, further comprising:
determining motion in a direction orthogonal to the two-dimensional planar motion component by simultaneously recording motion data received by means of radio transmission from the object.

8. The method according to claim 7, further comprising:
combining the planar motion information from the video signal with the motion data received by the radio transmission, and correcting and referencing the motion information sent by the radio transmission.

9. The method according to any of claims 1-6, further comprising:
determining motion in a direction orthogonal to the two-dimensional planar motion component by simultaneously recording motion data received by means of wire transmission from the tracked object.

10. The method according to claim 9, further comprising:
combining the planar motion information from the video signal with the motion data received by the wire transmission, and correcting and referencing the motion information sent by the wire transmission.

11. A computer program comprising program code which, when executed on a processor, implements the method of any of the claims.

12. The computer program according to claim 11, wherein the computer program is embodied on a computer-readable medium.

13. An apparatus for determining motion information of an object from a video signal, the apparatus comprising:
a processor;
means for receiving the video signal; and
a memory comprising a computer program which, when executed by the processor, is configured to:
receive the video signal, the video signal comprising a plurality of frames representing motion;
analyse the video signal to determine at least one identifiable object in the plurality of frames of the video signal; and
determine motion information of the object orthogonal to the two-dimensional planar motion of the video signal by tracking changes in optical properties of the at least one identifiable object in the plurality of frames of the video signal.

14. The apparatus according to claim 13, wherein the processor is configured to:
analyse the video signal to track a two-dimensional planar motion component of the object;
determine the relative size of the at least one identifiable object in the frames of the video signal; and
calculate motion information of the object orthogonal to the two-dimensional planar motion component from the changes in the relative size of the at least one identifiable object in the frames of the video signal.

15. The apparatus according to claim 13, wherein the at least one identifiable object comprises at least one light source.

16. The apparatus according to claim 15, wherein the identifiable object is a light source and the processor is configured to:
analyse the video signal to track a two-dimensional planar motion component of the object; and
determine motion in a direction orthogonal to the two-dimensional planar motion component from signals of the light source, the signals of the light source having been modulated based on motion of the object.

17. The apparatus according to claim 15, wherein the at least one identifiable object comprises a red light source, a green light source and a blue light source, and wherein the processor is configured to:
determine motion in a first direction from signals of the red light source, the signals of the red light source having been modulated based on motion of the object in the first direction;
determine motion in a second direction from signals of the green light source, the signals of the green light source having been modulated based on motion of the object in the second direction; and
determine motion in a third direction from signals of the blue light source, the signals of the blue light source having been modulated based on motion of the object in the third direction.

18. The apparatus according to any of claims 13-17, wherein the at least one identifiable object comprises three light sources, and wherein the processor is configured to:
determine motion from signals of a first light source, the signals of the first light source having been modulated as a function of the absolute sum of acceleration;
determine a static reference signal from signals of a second light source;
determine the direction of the motion from signals of a third light source; and
determine the three-dimensional motion based on the decoded signals from the first, second and third light sources.

19. The apparatus according to any of claims 13-18, further comprising:
a receiver configured to receive a radio transmission;
wherein the processor is configured to:
determine motion in a direction orthogonal to the two-dimensional planar motion component by simultaneously recording motion data received by means of the radio transmission from the object.

20. The apparatus according to claim 19, wherein the processor is configured to:
combine the planar motion information from the video signal with the motion data received by the radio transmission; and
correct and reference the motion information sent by the radio transmission.

21. The apparatus according to any of claims 13-18, further comprising:
a receiver configured to receive a wire transmission;
wherein the processor is configured to:
determine motion in a direction orthogonal to the two-dimensional planar motion component by simultaneously recording motion data received by means of the wire transmission from the tracked object.

22. The apparatus according to claim 21, wherein the processor is configured to:
combine the planar motion information from the video signal with the motion data received by the wire transmission; and
correct and reference the motion information sent by the wire transmission.

23. A system for determining motion information of an object from a video signal, the system comprising:
a video camera;
at least one identifiable object attachable to a target;
an apparatus comprising a processor;
means for receiving the video signal; and
a memory comprising a computer program which, when executed by the processor, is configured to:
receive the video signal, the video signal comprising a plurality of frames representing motion;
analyse the video signal to determine at least one identifiable object in the plurality of frames of the video signal; and
determine motion information of the object orthogonal to the two-dimensional planar motion of the video signal by tracking changes in optical properties of the at least one identifiable object in the plurality of frames of the video signal.

ABSTRACT

The invention provides a method for determining motion information of an object from a video signal. The method comprises: receiving the video signal, the video signal comprising a plurality of frames representing motion; analysing the video signal to determine at least one identifiable object in the plurality of frames of the video signal; and determining motion information of the object orthogonal to the two-dimensional planar motion of the video signal by tracking changes in optical properties of the at least one identifiable object in the plurality of frames of the video signal. (FIG. 1)


FIG. 1 (flow diagram): VIDEO SIGNAL -> RECEIVING THE VIDEO SIGNAL, THE VIDEO SIGNAL COMPRISING A PLURALITY OF FRAMES -> ANALYSING THE VIDEO SIGNAL TO DETERMINE AT LEAST ONE IDENTIFIABLE OBJECT -> DETERMINING MOTION INFORMATION OF THE OBJECT ORTHOGONAL TO THE TWO-DIMENSIONAL PLANAR MOTION OF THE VIDEO SIGNAL BY TRACKING CHANGES IN OPTICAL PROPERTIES OF THE AT LEAST ONE IDENTIFIABLE OBJECT

FIG. 2

FIG. 3A and FIG. 3B (object position (x, y, z) in two consecutive frames)

FIG. 4A and FIG. 4B (marker 42; object positions (x1, y1, z1) and (x2, y2, z2))

FIG. 5A and FIG. 5B (RGB light sources; object positions (x1, y1, z1) and (x2, y2, z2))

FIG. 6 (apparatus 60: memory 62, processor 64, video signal receiver 66); FIG. 7 (identifiable object(s) 70, video camera 72, processing apparatus 74)


More information

(51) Int Cl.: H04L 1/00 ( )

(51) Int Cl.: H04L 1/00 ( ) (19) TEPZZ Z4 497A_T (11) EP 3 043 497 A1 (12) EUROPEAN PATENT APPLICATION published in accordance with Art. 153(4) EPC (43) Date of publication: 13.07.2016 Bulletin 2016/28 (21) Application number: 14842584.6

More information

Chapter 7 Memory and Programmable Logic

Chapter 7 Memory and Programmable Logic EEA091 - Digital Logic 數位邏輯 Chapter 7 Memory and Programmable Logic 吳俊興國立高雄大學資訊工程學系 2006 Chapter 7 Memory and Programmable Logic 7-1 Introduction 7-2 Random-Access Memory 7-3 Memory Decoding 7-4 Error

More information

(12) Patent Application Publication (10) Pub. No.: US 2015/ A1

(12) Patent Application Publication (10) Pub. No.: US 2015/ A1 (19) United States (12) Patent Application Publication (10) Pub. No.: US 2015/0116196A1 Liu et al. US 2015O11 6 196A1 (43) Pub. Date: Apr. 30, 2015 (54) (71) (72) (73) (21) (22) (86) (30) LED DISPLAY MODULE,

More information

(12) United States Patent (10) Patent No.: US 6,462,786 B1

(12) United States Patent (10) Patent No.: US 6,462,786 B1 USOO6462786B1 (12) United States Patent (10) Patent No.: Glen et al. (45) Date of Patent: *Oct. 8, 2002 (54) METHOD AND APPARATUS FOR BLENDING 5,874.967 2/1999 West et al.... 34.5/113 IMAGE INPUT LAYERS

More information

(12) (10) Patent No.: US 8,316,390 B2. Zeidman (45) Date of Patent: Nov. 20, 2012

(12) (10) Patent No.: US 8,316,390 B2. Zeidman (45) Date of Patent: Nov. 20, 2012 United States Patent USOO831 6390B2 (12) (10) Patent No.: US 8,316,390 B2 Zeidman (45) Date of Patent: Nov. 20, 2012 (54) METHOD FOR ADVERTISERS TO SPONSOR 6,097,383 A 8/2000 Gaughan et al.... 345,327

More information

(12) United States Patent

(12) United States Patent (12) United States Patent Swan USOO6304297B1 (10) Patent No.: (45) Date of Patent: Oct. 16, 2001 (54) METHOD AND APPARATUS FOR MANIPULATING DISPLAY OF UPDATE RATE (75) Inventor: Philip L. Swan, Toronto

More information

UNIT V 8051 Microcontroller based Systems Design

UNIT V 8051 Microcontroller based Systems Design UNIT V 8051 Microcontroller based Systems Design INTERFACING TO ALPHANUMERIC DISPLAYS Many microprocessor-controlled instruments and machines need to display letters of the alphabet and numbers. Light

More information

(12) Patent Application Publication (10) Pub. No.: US 2017/ A1

(12) Patent Application Publication (10) Pub. No.: US 2017/ A1 (19) United States US 201701 18527A1 (12) Patent Application Publication (10) Pub. No.: US 2017/0118527 A1 Wachob et al. (43) Pub. Date: Apr. 27, 2017 (54) SYSTEM AND METHOD FOR PROVIDING H04N 7/2 (2006.01)

More information

Digital Logic Design: An Overview & Number Systems

Digital Logic Design: An Overview & Number Systems Digital Logic Design: An Overview & Number Systems Analogue versus Digital Most of the quantities in nature that can be measured are continuous. Examples include Intensity of light during the day: The

More information

LED TEST. ATX Hardware GmbH West and Feasa Enterprises Limited a long-standing partnership. SPECIALIST FIXTURE SOLUTIONS

LED TEST. ATX Hardware GmbH West and Feasa Enterprises Limited a long-standing partnership. SPECIALIST FIXTURE SOLUTIONS SPECIALIST FIXTURE SOLUTIONS LED TEST ATX Hardware GmbH West and Feasa Enterprises Limited a long-standing partnership. Two can provide more expertise and specialisation than one! ATX is established specialists

More information

Automatic optimization of image capture on mobile devices by human and non-human agents

Automatic optimization of image capture on mobile devices by human and non-human agents Automatic optimization of image capture on mobile devices by human and non-human agents 1.1 Abstract Sophie Lebrecht, Mark Desnoyer, Nick Dufour, Zhihao Li, Nicole A. Halmi, David L. Sheinberg, Michael

More information

CABLE MODEM. COURSE INSTRUCTOR Prof.Andreas Schrader

CABLE MODEM. COURSE INSTRUCTOR Prof.Andreas Schrader CABLE MODEM COURSE INSTRUCTOR Prof.Andreas Schrader Imran Ahmad ISNM 2003 Cable Modem What is cable modem The cable modem is another technology, which has recently emerged into the home user Market. It

More information

DIVISION 28. systems. conditions. GENERAL PART 1 PART 2 PRODUCTS. Products, Inc. (2) The. (3) The one modules. (4) The. to CD-R, CD- technology.

DIVISION 28. systems. conditions. GENERAL PART 1 PART 2 PRODUCTS. Products, Inc. (2) The. (3) The one modules. (4) The. to CD-R, CD- technology. VITEK CHRONO SERIES 8 CHANNEL DIGITAL VIDEO RECORDER DIVISION 28 ELECTRONIC SAFETY AND SECURITY 28 20 00 ELECTRONICC SURVEILLANCE 28 23 00 VIDEO SURVEILLANCE 28 23 29 VIDEO SURVEILLANCE REMOTE DEVICES

More information

EP A2 (19) (11) EP A2 (12) EUROPEAN PATENT APPLICATION. (43) Date of publication: Bulletin 2011/39

EP A2 (19) (11) EP A2 (12) EUROPEAN PATENT APPLICATION. (43) Date of publication: Bulletin 2011/39 (19) (12) EUROPEAN PATENT APPLICATION (11) EP 2 368 716 A2 (43) Date of publication: 28.09.2011 Bulletin 2011/39 (51) Int Cl.: B41J 3/407 (2006.01) G06F 17/21 (2006.01) (21) Application number: 11157523.9

More information

(12) Patent Application Publication (10) Pub. No.: US 2012/ A1. MOHAPATRA (43) Pub. Date: Jul. 5, 2012

(12) Patent Application Publication (10) Pub. No.: US 2012/ A1. MOHAPATRA (43) Pub. Date: Jul. 5, 2012 US 20120169931A1 (19) United States (12) Patent Application Publication (10) Pub. No.: US 2012/0169931 A1 MOHAPATRA (43) Pub. Date: Jul. 5, 2012 (54) PRESENTING CUSTOMIZED BOOT LOGO Publication Classification

More information

(12) United States Patent

(12) United States Patent (12) United States Patent Imai et al. USOO6507611B1 (10) Patent No.: (45) Date of Patent: Jan. 14, 2003 (54) TRANSMITTING APPARATUS AND METHOD, RECEIVING APPARATUS AND METHOD, AND PROVIDING MEDIUM (75)

More information

TOWARD A FOCUSED MARKET William Bricken September A variety of potential markets for the CoMesh product. TARGET MARKET APPLICATIONS

TOWARD A FOCUSED MARKET William Bricken September A variety of potential markets for the CoMesh product. TARGET MARKET APPLICATIONS TOWARD A FOCUSED MARKET William Bricken September 2002 A variety of potential markets for the CoMesh product. POTENTIAL TARGET MARKET APPLICATIONS set-top boxes direct broadcast reception signal encoding

More information

(12) Patent Application Publication (10) Pub. No.: US 2010/ A1

(12) Patent Application Publication (10) Pub. No.: US 2010/ A1 US 20100079670A1 (19) United States (12) Patent Application Publication (10) Pub. No.: US 2010/0079670 A1 Frazier et al. (43) Pub. Date: Apr. 1, 2010 (54) MULTI-VIEW CONTENT CASTING SYSTEMS Publication

More information

Systems and methods of camera-based fingertip tracking

Systems and methods of camera-based fingertip tracking University of Central Florida UCF Patents Patent Systems and methods of camera-based fingertip tracking 6-12-2012 Andrew Sugaya University of Central Florida Find similar works at: http://stars.library.ucf.edu/patents

More information

(12) United States Patent (10) Patent N0.2 US 7,429,988 B2 Gonsalves et a]. (45) Date of Patent: Sep. 30, 2008

(12) United States Patent (10) Patent N0.2 US 7,429,988 B2 Gonsalves et a]. (45) Date of Patent: Sep. 30, 2008 US007429988B2 (12) United States Patent (10) Patent N0.2 US 7,429,988 B2 Gonsalves et a]. (45) Date of Patent: Sep. 30, 2008 (54) METHODS AND APPARATUS FOR 5,786,776 A 7/1998 Kisaichi et a1. CONVENIENT

More information

(12) Patent Application Publication (10) Pub. No.: US 2006/ A1

(12) Patent Application Publication (10) Pub. No.: US 2006/ A1 (19) United States US 20060097752A1 (12) Patent Application Publication (10) Pub. No.: Bhatti et al. (43) Pub. Date: May 11, 2006 (54) LUT BASED MULTIPLEXERS (30) Foreign Application Priority Data (75)

More information

(12) Patent Application Publication (10) Pub. No.: US 2010/ A1

(12) Patent Application Publication (10) Pub. No.: US 2010/ A1 US 2010.0097.523A1. (19) United States (12) Patent Application Publication (10) Pub. No.: US 2010/0097523 A1 SHIN (43) Pub. Date: Apr. 22, 2010 (54) DISPLAY APPARATUS AND CONTROL (30) Foreign Application

More information

(12) United States Patent

(12) United States Patent USOO9578298B2 (12) United States Patent Ballocca et al. (10) Patent No.: (45) Date of Patent: US 9,578,298 B2 Feb. 21, 2017 (54) METHOD FOR DECODING 2D-COMPATIBLE STEREOSCOPIC VIDEO FLOWS (75) Inventors:

More information

(12) United States Patent (10) Patent No.: US 6,501,230 B1

(12) United States Patent (10) Patent No.: US 6,501,230 B1 USOO65O123OB1 (12) United States Patent (10) Patent No.: Feldman (45) Date of Patent: Dec. 31, 2002 (54) DISPLAY WITH AGING CORRECTION OTHER PUBLICATIONS CIRCUIT Salam, OLED and LED Displays with Autonomous

More information

??? Introduction. Learning Objectives. On completion of this chapter you will be able to:

??? Introduction. Learning Objectives. On completion of this chapter you will be able to: Introduction??? Learning Objectives On completion of this chapter you will be able to: 1. Construct the block diagram for Fibre Optic Communication System. 2. Mention the sources which are used for transmission

More information

Page 1 of 14 HTC EXHIBIT 1001

Page 1 of 14 HTC EXHIBIT 1001 111111 1111111111111111111111111111111111111111111111111111111111111 US008050711B2 c12) United States Patent Wang et al. (10) Patent No.: (45) Date of Patent: *Nov. 1, 2011 (54) METHODS, SYSTEMS AND APPARATUS

More information

(12) United States Patent (10) Patent No.: US 9, B1

(12) United States Patent (10) Patent No.: US 9, B1 USOO9658462B1 (12) United States Patent () Patent No.: US 9,658.462 B1 Duffy (45) Date of Patent: May 23, 2017 (54) METHODS AND SYSTEMS FOR (58) Field of Classification Search MANUFACTURING AREAR PROJECTION

More information

(12) Patent Application Publication (10) Pub. No.: US 2008/ A1

(12) Patent Application Publication (10) Pub. No.: US 2008/ A1 US 2008O1891. 14A1 (19) United States (12) Patent Application Publication (10) Pub. No.: US 2008/0189114A1 FAIL et al. (43) Pub. Date: Aug. 7, 2008 (54) METHOD AND APPARATUS FOR ASSISTING (22) Filed: Mar.

More information

(12) United States Patent

(12) United States Patent (12) United States Patent USOO7332664B2 (10) Patent No.: US 7,332,664 B2 Yung (45) Date of Patent: Feb. 19, 2008 (54) SYSTEM AND METHOD FOR MUSICAL 6,072,113 A 6/2000 Tohgi et al. INSTRUMENT EDUCATION

More information

(12) United States Patent

(12) United States Patent (12) United States Patent Roberts et al. USOO65871.89B1 (10) Patent No.: (45) Date of Patent: US 6,587,189 B1 Jul. 1, 2003 (54) (75) (73) (*) (21) (22) (51) (52) (58) (56) ROBUST INCOHERENT FIBER OPTC

More information

Design Project: Designing a Viterbi Decoder (PART I)

Design Project: Designing a Viterbi Decoder (PART I) Digital Integrated Circuits A Design Perspective 2/e Jan M. Rabaey, Anantha Chandrakasan, Borivoje Nikolić Chapters 6 and 11 Design Project: Designing a Viterbi Decoder (PART I) 1. Designing a Viterbi

More information

(12) United States Patent

(12) United States Patent USOO9609033B2 (12) United States Patent Hong et al. (10) Patent No.: (45) Date of Patent: *Mar. 28, 2017 (54) METHOD AND APPARATUS FOR SHARING PRESENTATION DATA AND ANNOTATION (71) Applicant: SAMSUNGELECTRONICS

More information

Module 8 VIDEO CODING STANDARDS. Version 2 ECE IIT, Kharagpur

Module 8 VIDEO CODING STANDARDS. Version 2 ECE IIT, Kharagpur Module 8 VIDEO CODING STANDARDS Lesson 27 H.264 standard Lesson Objectives At the end of this lesson, the students should be able to: 1. State the broad objectives of the H.264 standard. 2. List the improved

More information

USOO595,3488A United States Patent (19) 11 Patent Number: 5,953,488 Seto (45) Date of Patent: Sep. 14, 1999

USOO595,3488A United States Patent (19) 11 Patent Number: 5,953,488 Seto (45) Date of Patent: Sep. 14, 1999 USOO595,3488A United States Patent (19) 11 Patent Number: Seto () Date of Patent: Sep. 14, 1999 54 METHOD OF AND SYSTEM FOR 5,587,805 12/1996 Park... 386/112 RECORDING IMAGE INFORMATION AND METHOD OF AND

More information

Tone Insertion To Indicate Timing Or Location Information

Tone Insertion To Indicate Timing Or Location Information Technical Disclosure Commons Defensive Publications Series December 12, 2017 Tone Insertion To Indicate Timing Or Location Information Peter Doris Follow this and additional works at: http://www.tdcommons.org/dpubs_series

More information

(12) Patent Application Publication (10) Pub. No.: US 2005/ A1. RF Component. OCeSSO. Software Application. Images from Camera.

(12) Patent Application Publication (10) Pub. No.: US 2005/ A1. RF Component. OCeSSO. Software Application. Images from Camera. (19) United States US 2005O169537A1 (12) Patent Application Publication (10) Pub. No.: US 2005/0169537 A1 Keramane (43) Pub. Date: (54) SYSTEM AND METHOD FOR IMAGE BACKGROUND REMOVAL IN MOBILE MULT-MEDIA

More information

(12) Patent Application Publication (10) Pub. No.: US 2005/ A1

(12) Patent Application Publication (10) Pub. No.: US 2005/ A1 (19) United States US 2005O105810A1 (12) Patent Application Publication (10) Pub. No.: US 2005/0105810 A1 Kim (43) Pub. Date: May 19, 2005 (54) METHOD AND DEVICE FOR CONDENSED IMAGE RECORDING AND REPRODUCTION

More information

6.111 Project Proposal IMPLEMENTATION. Lyne Petse Szu-Po Wang Wenting Zheng

6.111 Project Proposal IMPLEMENTATION. Lyne Petse Szu-Po Wang Wenting Zheng 6.111 Project Proposal Lyne Petse Szu-Po Wang Wenting Zheng Overview: Technology in the biomedical field has been advancing rapidly in the recent years, giving rise to a great deal of efficient, personalized

More information

The software concept. Try yourself and experience how your processes are significantly simplified. You need. weqube.

The software concept. Try yourself and experience how your processes are significantly simplified. You need. weqube. You need. weqube. weqube is the smart camera which combines numerous features on a powerful platform. Thanks to the intelligent, modular software concept weqube adjusts to your situation time and time

More information

Mid Term Papers. Fall 2009 (Session 02) CS101. (Group is not responsible for any solved content)

Mid Term Papers. Fall 2009 (Session 02) CS101. (Group is not responsible for any solved content) Fall 2009 (Session 02) CS101 (Group is not responsible for any solved content) Subscribe to VU SMS Alert Service To Join Simply send following detail to bilal.zaheem@gmail.com Full Name Master Program

More information

(12) Patent Application Publication (10) Pub. No.: US 2011/ A1

(12) Patent Application Publication (10) Pub. No.: US 2011/ A1 (19) United States US 2011 0320948A1 (12) Patent Application Publication (10) Pub. No.: US 2011/0320948 A1 CHO (43) Pub. Date: Dec. 29, 2011 (54) DISPLAY APPARATUS AND USER Publication Classification INTERFACE

More information

The BBC micro:bit: What is it designed to do?

The BBC micro:bit: What is it designed to do? The BBC micro:bit: What is it designed to do? The BBC micro:bit is a very simple computer. A computer is a machine that accepts input, processes this according to stored instructions and then produces

More information

Innovative Rotary Encoders Deliver Durability and Precision without Tradeoffs. By: Jeff Smoot, CUI Inc

Innovative Rotary Encoders Deliver Durability and Precision without Tradeoffs. By: Jeff Smoot, CUI Inc Innovative Rotary Encoders Deliver Durability and Precision without Tradeoffs By: Jeff Smoot, CUI Inc Rotary encoders provide critical information about the position of motor shafts and thus also their

More information

Cambridge International Examinations Cambridge International General Certificate of Secondary Education

Cambridge International Examinations Cambridge International General Certificate of Secondary Education www.xtremepapers.com Cambridge International Examinations Cambridge International General Certificate of Secondary Education *5619870491* COMPUTER SCIENCE 0478/11 Paper 1 Theory May/June 2015 1 hour 45

More information

(12) Patent Application Publication (10) Pub. No.: US 2004/ A1

(12) Patent Application Publication (10) Pub. No.: US 2004/ A1 (19) United States US 004063758A1 (1) Patent Application Publication (10) Pub. No.: US 004/063758A1 Lee et al. (43) Pub. Date: Dec. 30, 004 (54) LINE ON GLASS TYPE LIQUID CRYSTAL (30) Foreign Application

More information

(10) Patent N0.: US 6,301,556 B1 Hagen et al. (45) Date of Patent: *Oct. 9, 2001

(10) Patent N0.: US 6,301,556 B1 Hagen et al. (45) Date of Patent: *Oct. 9, 2001 (12) United States Patent US006301556B1 (10) Patent N0.: US 6,301,556 B1 Hagen et al. (45) Date of Patent: *Oct. 9, 2001 (54) REDUCING SPARSENESS IN CODED (58) Field of Search..... 764/201, 219, SPEECH

More information