
UNITED STATES PATENT AND TRADEMARK OFFICE

BEFORE THE PATENT TRIAL AND APPEAL BOARD

APPLE INC., Petitioner

v.

IMMERSION CORPORATION, Patent Owner

U.S. Patent No. 8,773,356
Filing Date: January 31, 2012
Issue Date: July 8, 2014
Title: Method and Apparatus for Providing Tactile Sensations

Inter Partes Review No.: (Unassigned)

DECLARATION OF DR. RICHARD T. MIHRAN

WEST\ EXHIBIT PAGE 1

EXHIBIT LIST

Exhibit No.  Description
1101  U.S. Patent No. 8,773,356 ("'356 patent")
1102  Reserved
1103  Administrative Judge's Construction of Terms, ITC Investigation No. 337-TA-
1104  File History of U.S. Patent No. 8,773,356
1105  U.S. Provisional App. No. 60/335,493 ("First Provisional")
1106  U.S. Provisional App. No. 60/399,883 ("Second Provisional")
1107  WO 2002/12991 A1 ("Fukumoto WO")
1108  Translation of WO 2002/12991 A1 ("Fukumoto WO")
1109  U.S. Pat. App. Pub. No. 2002/ ("Fukumoto US")
1110  Japanese Published Application No. H
1111  Translation of Japanese Published Application No. H ("Tsuji")
1112  U.S. Pat. App. Pub. No. US2008/ ("Rosenberg '350")
1113  IBM Model M Keyboard Release, April 1986 (available at 01.ibm.com/common/ssi/ShowDoc.wss?docURL=/common/ssi/rep_ca/8/877/ENUSZG /index.html&lang=en&request_locale=en) ("IBM Release")
1114  IBM Model M Keyboard Photograph
1115  U.S. Pat. No. 5,575,576 ("Roysden")

I. INTRODUCTION

1. I have been retained by counsel for Apple Inc. as an expert witness in the above-captioned proceeding. I have been asked to provide my opinion about the patentability of claims 1-26 of U.S. Patent No. 8,773,356 (the "'356 patent").

2. I have been retained at my normal hourly rate of $600 per hour. No part of my compensation is dependent upon the outcome of this matter or the specifics of my testimony.

A. Background and Qualifications

3. My curriculum vitae ("CV") is attached as Appendix A.

4. I am a Professor Adjunct in the Department of Electrical and Computer Engineering at the University of Colorado at Boulder, where I have been on the faculty since . I teach a wide variety of classes at the undergraduate and graduate level covering general electrical and computer engineering theory and practice, including circuit theory, microelectronics, signal processing, and medical devices and systems. Many of these classes incorporate both lecture and laboratory components that include hardware and software design.

5. Courses I have taught include topics such as analog and digital circuit theory and design, microelectronics, signal processing, radio-frequency identification devices, miniaturized implantable medical devices incorporating embedded systems, and optics and optical electronics, including semiconductor

laser diodes, Fourier optics, non-linear optics, optical sensors, and wave propagation in optical fibers. Many of these courses include components directly related to the design and implementation of portable electronic devices such as smartphones and computers, as well as subject matter relating to haptic feedback that may be used in such devices. These courses further include components and concepts directly relevant to electronic devices and systems and their interfaces with other devices, including communications networks, general principles of wired and wireless RF communications, and data signal modulation and encoding in a variety of applications.

6. The devices and methods claimed in the '356 patent encompass several technology areas, including the basic architecture of smart devices/embedded systems (controller, memory, display, data structures, etc.), general principles of tactile feedback (physiology of human tactile perception, such as sensitivity to vibration frequency/amplitude), haptic actuators and drive signals (e.g., piezoelectric and voice-coil actuators and control of their signal sources), and the integration of these elements within the context of the user interface.

7. With respect to basic embedded systems implementation, I have been involved in microcontroller-based designs of portable data acquisition, processing and computing devices for over 35 years, utilizing commercial microprocessors manufactured by Intel, Motorola, Zilog, and Microchip, among others. These devices

generally included various forms of displays and user interfaces as part of their implementation, along with various sensors and actuators. Research projects I have directed involving microprocessor-based systems include the development of embedded system biosensor and immunoassay devices, radar signal processing devices, spread-spectrum data telemetry devices, and microprocessor-controlled drug infusion devices utilizing various mechanical actuators.

8. With respect to the integration of tactile feedback with electronic devices, I have an extensive background in neuroscience and electrophysiology, including performing research on the effects of various forms of mechanical stimuli on nerve cells. This research included the development of systems to deliver low-level mechanical stimuli to neural and other tissues using both direct mechanical/vibratory stimulation, as well as mechanical stimuli delivered using pulsed acoustic/ultrasonic stimuli. This work included the design and implementation of the mechanical actuators, including voice-coil and piezoelectric actuators, to deliver the mechanical stimuli to the neural tissue over a broad range of frequencies and amplitudes.

9. I have further taught courses at the undergraduate and graduate level in basic neuronal electrophysiology, as well as the development of implantable medical devices with neural interfaces, including cochlear and retinal implants, spinal cord stimulation devices, and motor neuron stimulation devices for

Functional Electrical Stimulation (FES).

10. As part of my faculty role at the University of Colorado, I participate in the supervision of doctoral research performed by graduate students as part of obtaining their doctoral degrees, including the development of haptic interfaces and communications for providing touch/tactile feedback for virtual environments during the late 1990s.

11. Many of these research projects have further involved the development of devices utilized in systems for acquiring, processing, storing and retrieving data, as well as computational algorithms and analytical techniques implemented in both software and firmware on a variety of computing platforms, including embedded microprocessor systems and personal computers (PCs). I am an inventor on three issued U.S. patents and one Canadian patent associated with some of these activities, two involving computer-based Doppler radar signal processing and data analysis, and two involving data telemetry utilizing spread-spectrum wireless links and database analysis systems for agricultural management.

12. Since obtaining my Ph.D. in 1990, I have actively consulted in industry in many areas of technology development, analysis and assessment, directed to both product development and analysis of intellectual property portfolios, patent infringement and validity. The fields of technology in which I

have consulted and/or served as a technical expert include computers, storage and data systems (Hewlett Packard, Maxtor Corp.); Smart Card and radio-frequency identification systems (HID/AssaAbloy, Inc.; Vue Technology Corp.); smartphones, telecommunications and networking and associated devices (e.g., AT&T, Kyocera Wireless, Sierra Wireless, Nokia, Verizon Wireless, Sprint, US Cellular, Time Warner Cable, Comcast Corp., Palm, Inc., Lucent Corp., Nortel, Inc., Qwest Corp. and others); 3-D seismic data analysis and imaging software (Terraspark Geosciences, Inc.); electronic securities trading networks (Sonic Trading, Inc.); medical devices and systems (e.g., Boston Scientific, St. Jude Medical); vision-based surgical tool tracking and navigation systems (Image Guided Technologies, Inc., formerly Pixsys, Inc.); and others.

13. I have served as an expert witness in many patent litigation matters in the areas of computers, data storage, telecommunications, medical devices, and others. I have been admitted and recognized in U.S. District Courts as a technical expert in seven separate patent trials, including most recently in a patent matter in the District of Delaware in which I served as a technical expert witness addressing two patents covering RFID transponders.

14. I have also previously been admitted and recognized as a technical expert by the International Trade Commission (ITC) in Washington, D.C., where I provided both a technology tutorial and subsequent testimony at trial in a patent

infringement complaint involving multiple patents directed to the function and design of smart phones and other portable computing devices capable of voice and data communications over cellular and other wireless networks.

15. I have also been recognized and admitted in the Federal District of Colorado as a technical expert and provided testimony at trial in the field of implantable radio-frequency identification transponders and readers.

16. I have further been admitted and recognized as a technical expert in wireless communications in the Northern District of California, San Jose Division, where I served as a technical expert witness on behalf of several manufacturers of wireless networking equipment. The accused products in that matter included PCMCIA wireless network adapters used to provide wireless connectivity to a variety of data networks, including Ethernet and cellular networks.

17. I have also been admitted and recognized as a technical expert in the Eastern District of Virginia, in which I served as a technical expert witness on behalf of several major cellular service providers and smart phone manufacturers. The accused products in that matter included USB and PCMCIA wireless network adapters used to provide wireless Internet connectivity to computers over cellular data networks, such as GSM and CDMA based networks.

18. I have also been admitted and recognized as a technical expert in the

Eastern District of Texas, in which I served as a technical expert witness addressing patents directed to integrated microcontrollers and associated network adapter modules used to provide Ethernet communications.

19. I have also been recognized and admitted in Federal District Courts as a technical expert witness and provided testimony at trial in the fields of digital signal processing/optoelectronic image processing (Eastern District of Virginia), and operational algorithms executed by firmware in embedded microprocessor systems included in implantable medical devices (Southern District of Indiana).

20. I received a BS in Electrical Engineering and Applied Physics from Case Western Reserve University, Cleveland, Ohio in . I further received an MS in Electrical and Computer Engineering and a Ph.D. in Electrical Engineering from the University of Colorado at Boulder in 1988 and 1990, respectively.

B. Information Considered

21. My opinions are based on my years of education, research, and experience, as well as my study of relevant materials. In forming my opinions, I have considered the materials identified in this declaration and in the Petition.

22. I may rely upon these materials and/or additional materials to respond to arguments raised by Immersion. I may also consider additional documents and information in forming any necessary opinions, including documents that may not yet have been provided to me.

23. My analysis of the materials produced in this matter is ongoing and I will continue to review any new material as it is provided. This declaration represents only those opinions I have formed to date. I reserve the right to revise, supplement, or amend my opinions stated herein based on new information and on my continuing analysis of the materials already provided.

II. LEGAL STANDARDS

A. Legal Standards for Prior Art

24. I understand that a patent or other publication must first qualify as prior art before it can be used to invalidate a patent claim.

25. I understand that a U.S. or foreign patent qualifies as prior art to an asserted patent if the date of issuance of the patent is prior to the invention of the asserted patent. I further understand that a printed publication, such as an article published in a magazine or trade publication, qualifies as prior art to an asserted patent if the date of publication is prior to the invention of the asserted patent.

26. I understand that a U.S. or foreign patent also qualifies as prior art to an asserted patent if the date of issuance of the patent is more than one year before the filing date of the asserted patent. I further understand that a printed publication, such as an article published in a magazine or trade publication, constitutes prior art to an asserted patent if the publication occurs more than one

year before the filing date of the asserted patent.

27. I understand that a U.S. patent qualifies as prior art to the asserted patent if the application for that patent was filed in the United States before the invention of the asserted patent.

B. Legal Standards for Anticipation

28. I understand that documents and materials that qualify as prior art can be used to invalidate a patent claim via anticipation or obviousness.

29. I understand that, once the claims of a patent have been properly construed, the second step in determining anticipation of a patent claim requires a comparison of the properly construed claim language to the prior art on a limitation-by-limitation basis.

30. I understand that a prior art reference anticipates an asserted claim, and thus renders the claim invalid, if all elements of the claim are disclosed in that prior art reference, either explicitly or inherently (i.e., necessarily present).

31. I understand that anticipation in an inter partes review must be shown by a preponderance of the evidence.

C. Legal Standards for Obviousness

32. I understand that even if a patent is not anticipated, it is still invalid if

the differences between the claimed subject matter and the prior art are such that the subject matter as a whole would have been obvious at the time the invention was made to a person of ordinary skill in the pertinent art.

33. I understand that a person of ordinary skill in the art provides a reference point from which the prior art and claimed invention should be viewed. This reference point prevents one from using his or her own insight or hindsight in deciding whether a claim is obvious.

34. I also understand that an obviousness determination includes the consideration of various factors such as (1) the scope and content of the prior art, (2) the differences between the prior art and the asserted claims, (3) the level of ordinary skill in the pertinent art, and (4) the existence of secondary considerations such as commercial success, long-felt but unresolved needs, failure of others, etc.

35. I understand that an obviousness evaluation can be based on a combination of multiple prior art references. I understand that the prior art references themselves may provide a suggestion, motivation, or reason to combine, but other times the nexus linking two or more prior art references is simple common sense. I further understand that obviousness analysis recognizes that market demand, rather than scientific literature, often drives innovation, and that a motivation to combine references may be supplied by the direction of the

marketplace.

36. I understand that if a technique has been used to improve one device, and a person of ordinary skill in the art would recognize that it would improve similar devices in the same way, using the technique is obvious unless its actual application is beyond his or her skill.

37. I also understand that practical and common sense considerations should guide a proper obviousness analysis, because familiar items may have obvious uses beyond their primary purposes. I further understand that a person of ordinary skill in the art looking to overcome a problem will often be able to fit together the teachings of multiple publications. I understand that obviousness analysis therefore takes into account the inferences and creative steps that a person of ordinary skill in the art would employ under the circumstances.

38. I understand that a particular combination may be proven obvious merely by showing that it was obvious to try the combination. For example, when there is a design need or market pressure to solve a problem and there are a finite number of identified, predictable solutions, a person of ordinary skill has good reason to pursue the known options within his or her technical grasp, because the result is likely the product not of innovation but of ordinary skill and common sense.

39. The combination of familiar elements according to known methods is likely to be obvious when it does no more than yield predictable results. When a work is available in one field of endeavor, design incentives and other market forces can prompt variations of it, either in the same field or a different one. If a person of ordinary skill can implement a predictable variation, the patent claim is likely obvious.

40. It is further my understanding that a proper obviousness analysis focuses on what was known or obvious to a person of ordinary skill in the art, not just the patentee. Accordingly, I understand that any need or problem known in the field of endeavor at the time of invention and addressed by the patent can provide a reason for combining the elements in the manner claimed.

41. I understand that a claim can be obvious in light of a single reference, without the need to combine references, if the elements of the claim that are not found explicitly or inherently in the reference can be supplied by the common sense of one of skill in the art.

42. I understand that secondary indicia of non-obviousness may include (1) a long-felt but unmet need in the prior art that was satisfied by the invention of the patent; (2) commercial success of processes covered by the patent; (3) unexpected results achieved by the invention; (4) praise of the invention by others

skilled in the art; (5) taking of licenses under the patent by others; (6) deliberate copying of the invention; (7) failure of others to find a solution to the long-felt need; and (8) skepticism by experts.

43. I also understand that there must be a relationship between any such secondary considerations and the invention. I further understand that contemporaneous and independent invention by others is a secondary consideration supporting an obviousness determination.

44. In sum, my understanding is that prior art teachings are properly combined where a person of ordinary skill in the art, having the understanding and knowledge reflected in the prior art and motivated by the general problem facing the inventor, would have been led to make the combination of elements recited in the claims. Under this analysis, the prior art references themselves, or any need or problem known in the field of endeavor at the time of the invention, can provide a reason for combining the elements of multiple prior art references in the claimed manner.

45. I understand that obviousness in an inter partes review must be shown by a preponderance of the evidence.

III. OVERVIEW OF THE '356 PATENT

A. Technology Background

46. As discussed in this section and throughout this report, the technology of the asserted claims was well known in the prior art.

47. The '356 patent is directed to well-known human-computer interaction components and their interconnections, namely, a controller that receives a signal from an input device comprising a touchscreen, and outputs a corresponding signal to an actuator that creates a desired tactile sensation using stored haptic effect data. The asserted claims of the '356 patent all require that the haptic effect data used to generate the actuator signal is stored in a lookup table.

48. To provide background for the element-by-element analysis of the claims to follow, below I will present an overview of the state of the art existing as of the time of the alleged invention relating to computing devices having touchscreens, touchscreens coupled to actuators to provide haptic feedback, and lookup tables that store haptic effect data. As I will describe below, all of these technologies and techniques for applying them to touchscreen devices to provide haptic feedback were well known to those of ordinary skill in the art at the time of the alleged invention claimed in the '356 patent.
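The controller/lookup-table/actuator arrangement described in paragraph 47 can be illustrated with a short sketch. This is my own minimal model for explanatory purposes only; the event names, frequencies, and amplitudes below are hypothetical assumptions and are not taken from the 356 patent or any exhibit:

```python
# Illustrative sketch only: a toy model of the signal path described above
# (touchscreen input -> controller -> lookup table -> actuator signal).
# Event names and effect values are hypothetical placeholders.

HAPTIC_LOOKUP = {
    # input event             (frequency in Hz, normalized amplitude)
    "button_press":   (250, 0.8),
    "button_release": (175, 0.4),
}

def controller(touch_event):
    """Map a touchscreen input signal to an actuator signal via the lookup table."""
    effect = HAPTIC_LOOKUP.get(touch_event)
    if effect is None:
        return None  # no stored haptic effect data for this event
    frequency_hz, amplitude = effect
    return {"frequency_hz": frequency_hz, "amplitude": amplitude}
```

The point of the sketch is only that the controller selects stored haptic effect data keyed by the input event, rather than computing the actuator signal from scratch at runtime.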

a. Touchscreens were well-known and widely applied in computing devices at the time of the alleged invention

49. A touchscreen that might be utilized with computing devices such as smart phones, PDAs, tablets, and the like comprises a touch-sensitive surface overlaying a flat-panel display, such that contact with the display surface by an object such as a stylus or fingertip can be detected and localized with respect to icons, text, or other information appearing on the display. At the time of the alleged invention claimed in the '356 patent, a variety of technologies were known and available for implementing touchscreen devices. While many variations of such implementations were known, among the more prevalent implementations were those using resistive, capacitive, optical, or acoustical (SAW) designs.

50. Touchscreen technologies were well known and widely deployed well before the alleged invention claimed in the '356 patent. The '356 patent admits in the Background section of the specification that utilizing touchscreens in computing devices such as mobile telephones and PDAs was well known to those of ordinary skill in the art at the time of the alleged invention:

Conventional electronic devices, such as mobile telephones and Personal Digital Assistants (PDAs), include visual displays. A user of such devices interacts with the visual display using any one of a number of input devices... The user provides instructions, responses, and other input to the device using such input devices.

Ex. at 1:30-38 (emphasis added).

When a flat surface interface device is used, such as a touchpad for a computer or PDA, these simple mechanical cues are unavailable to the user. Often, touchpads are combined with flat-panel display screens that display one or more graphically generated buttons or softkeys. Normally, the softkeys are visible through the touchpad. A user's contact with the touchpad in an area defined by a softkey provides the electronic device having the touchpad with the input associated with that softkey.

Ex. at 1:62-2:3 (emphasis added).

51. The fact that touchscreen technology was well known and widely used in computing devices prior to the alleged invention is further demonstrated by its disclosure in prior art references, as well as its utilization in commercial devices.

52. A further example demonstrating that touchscreen devices were well known prior to the invention claimed in the '356 patent is found in Tsuji:

Devices where a touch panel is arranged on a display are in wide use as one type of information display device having an operation input function. Touch panels are extremely thin, and have the advantage of providing a high degree of freedom for selecting an area that can be used as a switch.

Ex. at [0002] (emphasis added).

53. One of the embodiments of the system disclosed in Tsuji comprises an Automatic Teller Machine (ATM) having a touchscreen, as shown in Figure 1. This figure, along with the accompanying description of its touchscreen, is reproduced below:

Ex. at Figure 1.

FIG. 1 is a perspective view of an automatic teller machine (ATM) 1 as an example of a system incorporating an information display device 100 of a first embodiment of the present invention. The automatic teller machine 1 is provided with a cashier section 3 and a card and bank passbook insertion section 4 on a front surface of a chassis 2. The machine is also provided with an information input and output section 5, and the information display device 100 is used in the information input and output section 5.

Ex. at [0041] (emphasis added).

54. A further embodiment of the system disclosed in Tsuji comprises a hand-held information display device having a touchscreen, as shown in Figure 17. This figure, along with the accompanying description of its touchscreen, is reproduced below:

FIG. 17 is a perspective drawing of the exterior of an information display device 200 according to a third embodiment of the present invention.

Ex. at [0040].

The operating regions R1 through R4 displayed by the liquid crystal display panel are visible through the operating surface 11 in FIG. 17. Typically, these operating regions R1 through R4 are displayed along both sides of the operating surface. An operator grips the housing 201 by both sides, as illustrated by the broken lines in FIG. 17, and performs operations by pressing these operating regions R1 through R4 with his/her thumbs. When the position of this pressing operation is sensed, if the pressing force thereof is larger than a predetermined threshold, the operation input is accepted, a displayed object 210 (FIG. 18) on a screen changes, and the operating surface 11 is vibrated or slightly displaced based on a predetermined mode. The operation at this point is the same as in the first and second embodiments.

Ex. at [0146] (emphasis added). Ex. at Figure 17.

55. A further example from the prior art demonstrating that touchscreen devices were well known prior to the invention claimed in the '356 patent is found

in Fukumoto. Fukumoto discloses a touch-sensitive input device configured to output a sensor signal indicating an object contacting the touch-sensitive input device. This is shown, for example, in Figure 1 of Fukumoto, which is reproduced below, along with the accompanying description of its touchscreen:

Ex. at Figure 1.

FIG. 1 is a perspective view illustrating the appearance of a PDA 10 according to a first embodiment of the present invention. In the figure, a transparent touch panel 102 is overlaid on a display screen of a liquid crystal display panel 103a covering an opening of a main case 101. A user inputs operation instructions to the PDA 10 by touching the touch panel 102 by his or her fingertip.

Ex. at [0146].

b. The use of haptic effects with touchscreens to provide haptic feedback was well-known in computing devices at the time of the alleged invention

56. As shown above, the use of touchscreens of various forms and designs

was well known to those of ordinary skill in the art at the time of the alleged invention, and had been described extensively in the literature, as well as utilized in commercial devices such as smart phones, PDAs, and tablet computing devices.

57. Moreover, the use of haptic effects in conjunction with such touchscreens to provide haptic feedback to a user was also well known to those of ordinary skill in the art at the time of the alleged invention. This is supported by the '356 patent itself, as well as demonstrated by the disclosures of Tsuji and Fukumoto.

58. The prior art is also replete with disclosures of the use of haptic effects in conjunction with touchscreens to provide tactile feedback in a variety of computing devices. For example, Tsuji demonstrates that touchscreen devices capable of providing haptic feedback were well known prior to the invention claimed in the '356 patent:

Devices where a touch panel is arranged on a display are in wide use as one type of information display device having an operation input function. Touch panels are extremely thin, and have the advantage of providing a high degree of freedom for selecting an area that can be used as a switch.

Ex. at [0002] (emphasis added).

59. One of the embodiments of the system disclosed in Tsuji comprises a hand-held information display device having a touchscreen which provides haptic effects in response to user interaction with displayed graphical objects, and is

shown in Figures 17 and 18. These figures, along with the accompanying description of the haptic feedback provided to a user contacting the touchscreen, are reproduced below:

FIG. 17 is a perspective drawing of the exterior of an information display device 200 according to a third embodiment of the present invention.

Ex. at [0040].

The operating regions R1 through R4 displayed by the liquid crystal display panel are visible through the operating surface 11 in FIG. 17. Typically, these operating regions R1 through R4 are displayed along both sides of the operating surface. An operator grips the housing 201 by both sides, as illustrated by the broken lines in FIG. 17, and performs operations by pressing these operating regions R1 through R4 with his/her thumbs. When the position of this pressing operation is sensed, if the pressing force thereof is larger than a predetermined threshold, the operation input is accepted, a displayed object 210 (FIG. 18) on a screen changes, and the operating surface 11 is vibrated or slightly displaced based on a predetermined mode. The operation at this point is the same as in the first and second embodiments.

Ex. at [0146] (emphasis added).

60. Thus, as further demonstrated by Tsuji and summarized above, the use of actuators coupled to a touchscreen to generate haptic effects and to provide haptic feedback to a user as they contact the screen while interacting with objects displayed on the touchscreen was well known to those of ordinary skill in the art at the time of the invention claimed in the '356 patent.

61. A further example demonstrating that touchscreen devices capable of providing haptic feedback were well known prior to the invention claimed in the '356 patent is found in Fukumoto. Fukumoto discloses a touch-sensitive input device configured to output a sensor signal indicating an object contacting the touch-sensitive input device, and to generate haptic feedback in response. This is shown, for example, in Figure 1 of Fukumoto, which is reproduced below, along with the accompanying description of its touchscreen:

Ex. & Ex. at Fig. 1.

FIG. 1 is a perspective view illustrating the appearance of a PDA 10 according to a first embodiment of the present invention. In the figure, a transparent touch panel 102 is overlaid on a display screen of a liquid crystal display panel 103a covering an opening of a main case 101. A user inputs operation instructions to the PDA 10 by touching the touch panel 102 by his or her fingertip.

Ex. at 13; Ex. at [0146].

62. Fukumoto further discloses an actuator coupled to the touch-sensitive input device comprising transparent touch panel 102 as shown above, to provide haptic effects. This is shown, for example, in Figure 2 of Fukumoto, reproduced below, with accompanying description:

Ex. & Ex. at Figure 2.

Next, FIG. 2 is a block diagram illustrating the hardware configuration of the PDA 10 shown in FIG. 1. As shown in this figure,

1 For convenience, in this report I include parallel cites to the Fukumoto US application along with Fukumoto WO cites because the US application references narrower paragraphs rather than pages. The quoted language is from Fukumoto US.

the PDA 10 has a touch panel 102, a display unit 103, a key input unit 111, a memory 112, a CPU (central processing unit) 113, a drive signal generation circuit 114, and an oscillatory actuator 115.

Ex at 13; Ex at [0147].

63. Figure 3 of Fukumoto depicts the coupling of the oscillatory actuator 115 referenced above with the touch screen comprising touch panel 102, thereby providing haptic feedback to the user's finger, as shown:

Ex & Ex at Figure 3.

64. Thus, as further demonstrated by Fukumoto and summarized above, the use of actuators coupled to a touchscreen to generate haptic effects and to provide haptic feedback to a user as they contact the screen while interacting with objects displayed on the touchscreen was well-known to those of ordinary skill in the art at the time of the invention claimed in the 356 patent.

c. Lookup tables were well-known data structures and were used in prior art devices to store haptic effect data at the time of the alleged invention

65. A lookup table is a particular form of data structure, commonly used in firmware and software design, that arranges the values of a function and associated keys in a table that can be efficiently accessed using a search key. It is commonly used to avoid runtime computation of the function values: by preloading the values into the table, which can be accessed quickly, the computational burden on the system is reduced.

66. Tables such as these, used in a table lookup procedure and containing haptic effect data, were well known as data structures that could be utilized by computing devices having touchscreens providing haptic feedback to the user. Some examples from the prior art are summarized below.

67. The use of a lookup table in a touchscreen device providing haptic feedback is found in Tsuji, and is depicted in Figure 13:

68. As described in Tsuji, a region determining signal SR is generated which indicates which region of the touchscreen (if any) is being contacted, and is used as one index to the table shown above in Figure 13. Ex at [ ]. In addition, a force determining signal FB, determined by classifying the level of force being applied to the screen, is used as a second index to the table shown in Figure 13. See, e.g., Ex at [ ].

69. Based on the values of the region determining signal SR and the force determining signal FB, a table lookup procedure yields the associated signal value, Smn. In the example depicted in Figure 13 above, a lookup procedure using an SR signal corresponding to region 2 (R2) and an FB signal having an operating force

classification value of F3 yields an actuator drive mode signal S23:

Specifically, as is illustrated in FIG. 13, with which region, R1 through R0, the region determining signal SR belongs to as a first index, and which of the operating force classifications, F1 through F4, a classification expressed by the operating force determining signal FB belongs to as a second index, the drive modes that should be selected for combinations of the first and second indices are stored in Table 72a in advance. Signals S11, S12, ... in FIG. 13 are codes for selecting and defining any of a variety of drive modes like, for example, that illustrated in FIG. 14.

Ex at [0109] (emphasis added).

70. The drive modes referenced above include actuator vibration modes having specified amplitudes, frequencies, and durations:

Fig. 14 schematically shows various drive modes stored in drive mode memory unit 73. For instance, Fig. 14(a) shows a mode for continuous vibrations with small amplitude, and Fig. 14(b) shows a mode for continuous vibration with large amplitude. Fig. 14(c) shows a vibration mode with different frequencies from Fig. 14(a) and (b), and Fig. 14(d), (e) show examples in which vibrations per unit time are executed once or twice. Furthermore, Fig. 14(f) is a vibration mode to provide only one time vibration (one shot pulse). Moreover, modes other than these will be explained later.

Ex at [0109] (emphasis added).

71. Thus, as further demonstrated by Tsuji and summarized above, the use of lookup tables in computing devices that provide haptic feedback to a user as they contact the screen while interacting with objects displayed on the touchscreen was well-known to those of ordinary skill in the art at the time of the invention claimed in the 356 patent.
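The mechanics of this kind of two-index table lookup can be illustrated with a short sketch. The following is my own illustrative Python code, not code from Tsuji or the 356 patent; the region names, force classifications, and drive-mode codes are hypothetical stand-ins patterned on the arrangement of Figure 13:

```python
# Illustrative sketch of a two-index drive-mode lookup table in the style of
# Tsuji's Figure 13. The region names (R1, R2), force classes (F1-F3), and
# drive-mode codes (Smn) are hypothetical placeholders, not data from Tsuji.
DRIVE_MODE_TABLE = {
    ("R1", "F1"): "S11", ("R1", "F2"): "S12", ("R1", "F3"): "S13",
    ("R2", "F1"): "S21", ("R2", "F2"): "S22", ("R2", "F3"): "S23",
}

def select_drive_mode(sr, fb):
    """Look up the precomputed drive-mode code for a region-determining
    signal SR and an operating-force-determining signal FB; returns None
    if the table has no entry for that combination of indices."""
    return DRIVE_MODE_TABLE.get((sr, fb))

# Mirroring the worked example above: region R2 with force class F3
# selects drive-mode code S23.
print(select_drive_mode("R2", "F3"))  # -> S23
```

Because all drive-mode codes are preloaded into the table, selecting a mode reduces to a single keyed access rather than a runtime computation, which is the efficiency benefit of lookup tables described in paragraph 65 above.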

72. Another example of the use of a lookup table in computing devices that provide haptic feedback to the user is found in Fukumoto:

FIG. 46 is a view illustrating a waveform data table stored in a memory in a PDA according to a seventh embodiment of the present invention.

Ex at 11; Ex at [0103]. Ex & Ex at Fig. 46.

73. Fukumoto further describes the use of this lookup table, comprising waveform data table 112d, in conjunction with user contact with graphical icons or touch buttons as shown in Figure 32:

Further, the memory 112 of the PDA 10 according to the present embodiment stores the waveform data table 112d shown in FIG. 46. The waveform data table 112d corresponds to the example of the screen display of the touch buttons shown in FIG. 32. The waveform data table 112d stores for each touch button the area data for the touch button and the waveform data to be applied to the oscillatory actuator

115 for each of the cases when the touch button is subjected to a touch operation and when it is subjected to a pressing operation.

Ex at 31; Ex at [0288].

74. Another example of a lookup table disclosed in Fukumoto appears in Figure 33, which is reproduced below, with accompanying description:

Ex & Ex at Figure 33.

Next, FIG. 33 is a view illustrating a waveform data table 112a stored in the memory 112 in the PDA 10. As shown in the figure, the waveform data table 112a stores, for each touch button displayed on the screen, area data showing the area occupied by a touch button on the touch panel 102 using XY coordinates as well as waveform data of the drive signal to be applied to the oscillatory actuator 115 when that touch button is pressed.

Ex at 28; Ex at [0262] (emphasis added).

75. Other discussions in Ex related to the use of lookup tables can be found at, e.g., [ ], [ ], [ ], [0329], [ ], [ ], [0353], [0358], [ ], [0370]; Figs. 33, 40, 42, 46, 54. See also corresponding pages and figures of Ex at 28-30, 36-37, 40-42, Figs. 33, 40,

42, 46, 54.

76. Thus, as further demonstrated by Fukumoto and summarized above, the use of lookup tables in computing devices that provide haptic feedback to a user as they contact the screen while interacting with objects displayed on the touchscreen was well-known to those of ordinary skill in the art at the time of the invention claimed in the 356 patent.

B. Summary of the 356 Patent

77. The 356 patent is directed to well-known human-computer interaction components and their interconnections, namely, a controller that receives a signal from an input device and outputs a corresponding signal to an actuator that creates a desired tactile sensation. The purported novelty of the 356 patent relates to generating the actuator signal based in part on haptic effect data in a lookup table, as discussed further in the section of this report addressing its prosecution history.

78. The invention claimed in the 356 patent discloses products and processes for providing tactile sensations to input devices or electronic devices. Ex at 2: According to the 356 patent, the input devices include mechanical input devices (such as, for example, mechanical switches) and nonmechanical input devices (such as, for example, touchpads). Id. at 1: Further, the 356 patent discloses that [t]actile feedback is provided by using an

actuator or other means in communication with the input device or electronic device, and that a controller may be employed to receive signals from the input devices and to control the actuator. Ex at 1: The patent discloses that [a] variety of feedback types and combinations may be selected, and that portable devices such as [m]obile telephones and PDAs benefit from employing such products and processes.

79. Figures 3 and 5 of the 356 patent, which are reproduced below, illustrate mobile telephone and PDA embodiments of the claimed invention:

80. The mobile telephone embodiment shown in Fig. 3 includes alphanumeric buttons 10, and a display screen capable of displaying graphic objects 16 and alphanumeric information 17. Ex at 8:

81. The PDA embodiment shown in Fig. 5 similarly includes a plurality

of mechanical type buttons 32 and a display panel 33 capable of displaying computer generated graphics. The display screen area further comprises an input device in the form of a pressure-sensitive touchpad 30. Ex at 11: The combination of display screen and pressure-sensitive touchpad may provide a plurality of software-generated buttons or keys, called softkeys 36a-i, that are utilized to provide a graphical user interface for the PDA 31. Id. at 11: A controller in the PDA (not shown in Fig. 5) is capable of determining the location on the display screen 33 that is touched by the object 24, and the softkey 36 corresponding to the touched location. Based upon this information, the controller causes the actuator 64 to provide a corresponding tactile sensation. Id. at 11:

82. A person of ordinary skill in the art would recognize that the use of a mechanical actuator integrated into portable electronic devices such as PDAs and mobile telephones to provide tactile sensations and tactile feedback provides what is known in the art as haptic feedback.

83. As reflected in the dictionary definition above, those of skill in the art recognize that haptic sensations may include both kinesthetic or force sensations, which are perceived via the proprioceptive system, and tactile sensations, which are primarily sensed via cutaneous receptors.

84. With respect to haptic feedback sensations that can be provided in

various devices, an example of a kinesthetic sensation might be the resistive force imparted to a joystick used as an input device in a virtual environment, such as a video game. The force sensation in this case would be primarily (though not exclusively) perceived by specialized receptors (proprioceptors) in the muscles, joints, and tendons. An example of a tactile sensation might be low-displacement vibrations or impulse forces imparted to a surface, which are primarily perceived via cutaneous receptors such as those found in the human fingertip.

85. A person of ordinary skill in the art would understand that the 356 patent is directed to the creation of haptic effects or sensations that are primarily tactile in nature, e.g., vibrations or impulses that are applied to mechanical keys, touchpads, or touchscreens commonly utilized in portable devices. In this regard, the 356 patent therefore appears to use the word haptic interchangeably with tactile.

86. For example, the tactile sensations produced by the mechanical actuator as described in the 356 patent as cited above are elsewhere in the specification alternatively described as haptic sensations:

In addition, the actuator can include a flexure, for example an arrangement of flexible material, coupled to the rotating shaft of a DC motor or step motor to transform the rotation of the motor shaft into vibrations or other haptic sensations.

Ex at 3:66-4:3 (emphasis added).

87. Similarly, the tactile feedback produced by the mechanical actuator

as described in the 356 patent as cited above (e.g., id. at 1:32-35) is elsewhere in the specification alternatively described as haptic feedback:

The storage memory includes a table in which input signals are associated with various haptic feedback signals.

Ex at 7:67-8:3 (emphasis added).

In one embodiment, in addition to providing haptic feedback to the input device, the controller 9 also sends a signal to the display 17 to cause the alphanumeric character associated with the input signal to be displayed.

Ex at 10:28-31 (emphasis added).

In one embodiment of the present invention, a device provides haptic feedback while navigating a menu structure, allowing a user to navigate the menu structure more efficiently, preferably without having to refer to the visual display.

Ex at 18:4-9 (emphasis added).

88. The specification of the 356 patent may also use the terms tactile feedback sensations, tactile sensations, and haptic sensations interchangeably to refer to the touch or tactile stimulation delivered by an actuator to a user via an input device, such as a switch or soft key on a display:

Preferably, the actuator 6 is configured to output a plurality of distinct tactile feedback sensations to the input device 2. Suitable tactile sensations include vibrations, for example, jolts and textures, and a plurality of distinct tactile sensations can be created by varying the frequency, amplitude and waveform output by the actuator 6. The actuator 6 is selected to deliver the desired tactile sensations to the input device 2. The actuator 6 shown in FIG. 1 is a voice coil actuator. Other suitable actuators include, for example, piezo-electric actuators, eccentric mass actuators, moving magnet actuators, and friction brakes in contact with metal shafts. In addition, the actuator can

include a flexure, for example an arrangement of flexible material, coupled to the rotating shaft of a DC motor or step motor to transform the rotation of the motor shaft into vibrations or other haptic sensations.

356 patent at 3:54-4:3 (emphasis added).

89. While I will discuss the teachings of the 356 patent in greater detail in the context of my specific invalidity analysis presented later in this report, there are a number of general observations regarding the disclosure that are helpful in framing which aspects of the claimed method and system for providing tactile feedback the patent purports to be inventive or novel. These observations include the following:

The 356 patent made no inventive contribution to the design of touchscreens.

The 356 patent made no inventive contribution to methods or devices for displaying graphical objects on a touchscreen.

The 356 patent made no inventive contribution to methods or devices for detecting contact between an object and displayed graphical objects on a touchscreen.

The 356 patent made no inventive contribution to methods or devices for determining an interaction between objects contacting the screen and displayed graphical objects on a touchscreen.

The 356 patent made no inventive contribution to methods or devices

for generating an actuator signal based on an interaction between objects contacting the screen and displayed graphical objects on a touchscreen.

90. I address each of these in turn below.

a. The 356 patent made no inventive contribution to the design of touchscreens

91. As described above, touchscreens were well-known to those of ordinary skill in the art prior to the alleged invention, and were deployed in many forms across a variety of commercial devices, including PDAs, smartphones, and other computing devices.

92. The asserted claims of the 356 patent recite generic and well-known features of touchscreens shared by the prior art touchscreens, such as the ability to display graphical objects and the ability to generate a signal indicating a location of contact. There are no claims directed to details of touchscreen design, construction, or use that can be considered novel or inventive, nor does the specification of the 356 patent disclose any details of touchscreen design that could be considered inventive.

93. I note in this context that no argument for patentability directed to the particular type or design of touchscreen was made by the Applicant during prosecution of the 356 patent to distinguish over the cited prior art.

b. The 356 patent made no inventive contribution to methods or devices for displaying graphical objects on a touchscreen

94. As described above, touchscreen displays were well-known to those of ordinary skill in the art prior to the alleged invention, and were deployed in many forms across a variety of commercial devices, including PDAs, smartphones, and other computing devices.

95. The asserted claims of the 356 patent recite only the basic and well-known display features of touchscreens shared by the prior art touchscreens, e.g., the basic ability to display graphical objects. There are no claims directed to details of touchscreen display component design, construction, or use that can be considered novel or inventive, nor does the specification of the 356 patent disclose any details of display design that could be considered inventive.

96. I note in this context that no argument for patentability directed to the particular type or design of touchscreen display was made by the Applicant during prosecution of the 356 patent to distinguish over the cited prior art.

c. The 356 patent made no inventive contribution to methods or devices for detecting contact between an object and displayed graphical objects on a touchscreen

97. As described above, touchscreens were well-known to those of ordinary skill in the art prior to the alleged invention, and were deployed in many forms across a variety of commercial devices, including PDAs, smartphones, and other computing devices.

98. The asserted claims of the 356 patent recite generic and well-known features of touchscreens shared by the prior art touchscreens, such as the ability to display graphical objects and the ability to generate a signal indicating a location of contact, as well as other characteristics of the contact, such as pressure or rate of movement. There are no claims directed to details of touchscreen design, construction, or use that can be considered novel or inventive, nor does the specification of the 356 patent disclose any details of touchscreen design or signal processing that could be considered inventive.

99. Further, as demonstrated by the prior art 867 patent, which the 356 patent admits disclosed prior art touchscreens (Ex at 2:17-19, referencing U.S. Patent No. 5,977,867), those of ordinary skill in the art knew how to detect contact between a physical object and displayed graphical objects on a touchscreen using well-known resistive, capacitive, optical, acoustic, or electromagnetic sensing means:

A touch pad such as a keypad or a touch screen is mounted with at least one vibrator to produce a tactile feedback sensed by the user as the pad is touched with a finger or a pointer. . . . The touch screen may be any of the analog resitive [sic], infrared, accoustic [sic], capacitive or electromagnetic inductive type operated devices.

867 patent at Abstract.

100. I note in this context that no argument for patentability directed to the particular type or design of touchscreen sensing capability was made by the

Applicant during prosecution of the 356 patent to distinguish over the cited prior art.

d. The 356 patent made no inventive contribution to methods or devices for determining an interaction between objects contacting the screen and displayed graphical objects on a touchscreen

101. As described above, touchscreens were well-known to those of ordinary skill in the art prior to the alleged invention, and were deployed in many forms across a variety of commercial devices, including PDAs, smartphones, and other computing devices.

102. The asserted claims of the 356 patent recite generic and well-known features of touchscreens shared by the prior art touchscreens, such as the ability to determine an interaction between objects contacting the screen and displayed graphical objects on a touchscreen. This interaction may be nothing more than simply determining whether the point of contact on the touchscreen does or does not overlap the area of the screen on which a graphical object is displayed, detection of applied pressure, or detection of the duration or motion of the contact.

103. Determining such an interaction is one of the most basic functions and uses of a touchscreen such as would be used in prior art PDAs, smartphones, and the like; it is described in the prior art references I summarized above, and thus cannot be considered novel or inventive.

104. Further, as demonstrated by the prior art 867 patent that the 356

patent admits disclosed prior art touchscreens (Ex at 2:17-19, referencing U.S. Patent No. 5,977,867), those of ordinary skill in the art knew how to detect contact between a physical object and displayed graphical objects on a touchscreen using well-known resistive, capacitive, optical, acoustic, or electromagnetic sensing means.

105. I note in this context that no argument for patentability directed to determining any particular type of interaction was made by the Applicant during prosecution of the 356 patent to distinguish over the cited prior art.

e. The 356 patent made no inventive contribution to methods or devices for generating an actuator signal based on an interaction and haptic effect data in a lookup table

106. As I will discuss in detail below in my overview of the prosecution history of the 356 patent, the only point of novelty that the Applicant argued over the prior art cited by the examiner was that the claimed method and system generated an actuator signal based at least in part on haptic effect data in a lookup table.

107. However, as summarized above, and as described further in my detailed analysis of the claims to follow, prior art computing devices providing haptic feedback via a touchscreen, such as those disclosed in Tsuji and Fukumoto, had anticipated the use of lookup tables to store data used to generate actuator signals. Thus, contrary to assertions by the Applicant during prosecution of the 356 patent, generating an actuator signal based at least in part on haptic effect data in a

lookup table cannot be considered novel or inventive, either alone or in the claimed combination with the other recited elements.

C. The 356 Patent Prosecution History

108. The 356 patent is a continuation of several prior applications, and further claims priority to two provisional applications. I was therefore asked to review the prosecution history of the 356 patent, as well as the prosecution histories of related applications, and to set forth my opinions as to the implications of these prosecutions and the remarks made to the examiners in the context of my invalidity analysis.

109. In addition, I was asked to review the two provisional applications identified on the cover page of the 356 patent to which priority is claimed, and to render my opinion as to whether the provisional applications provide written description support for the full scope of the issued claims of the 356 patent. I have been advised that if a provisional application does not provide written description of the full scope of the issued claims of the 356 patent, then those claims would not be entitled to priority based on the effective filing date of the provisional application.

110. I will address each of these issues in turn in the sections that follow, beginning with an overview of the prosecution history of the 356 patent.

a. Prosecution History of the 356 Patent

111. The application for the 356 patent was filed on January 31, 2012 and assigned serial number 13/362,113 (the 113 application). The 113 application claimed priority to three earlier non-provisional applications and two provisional applications. Specification as filed on 1/31/12 at [0001].

112. All pending claims of the then-pending application were initially rejected in an office action dated March 7, 2013, on the basis of nonstatutory obviousness-type double patenting as being unpatentable over U.S. Patent No. 8,159,461. 3/7/2013 Rejection at 3. The applicants overcame this initial rejection by filing a terminal disclaimer in an amendment dated July 8, 2013.

113. In a subsequent office action dated September 17, 2013, the examiner rejected all pending claims as anticipated by the disclosure found in commonly-assigned U.S. Patent Application Publication No. 2008/ to Rosenberg, et al. (Rosenberg 350).

114. The applicants made two general arguments for patentability over Rosenberg 350 in response to this office action. The first argument was that Rosenberg 350 was published after the alleged priority date of the 356 patent, and therefore was not prior art. The second argument for patentability was based on the alleged lack of disclosure in Rosenberg 350 of an actuator signal that is generated at least in part by haptic effect data in a lookup table.

115. With respect to the first argument directed to priority, the applicants argued that Rosenberg 350 was not a 35 U.S.C. 102(b) reference because it was published on March 20, 2008, which was later than the priority date claimed by the 113 application. (2/10/14 Amendment at 9.) The applicants admitted that Rosenberg 350 was a continuation of Rosenberg 737, but represented that Rosenberg 737 was first published on November 1, 2011, and that Rosenberg 350 was therefore only available as a reference under 35 U.S.C. 102(e). Id. (Thus, based on the respective priority claims of the present application and Rosenberg, Applicant respectfully asserts that under the above analysis Rosenberg is only available for use under 35 U.S.C. 102(e).)

116. Attorneys for Apple have advised me, however, that Rosenberg 737 was first published by WIPO on July 26, 2001 as the priority document for WO01/54109, and is thus also available as a reference under 35 U.S.C. 102(a), (b) and 103. I have therefore utilized Rosenberg 737 in my detailed invalidity analysis in subsequent sections of this report on that basis.

117. With respect to the second argument, I note that the applicants did not attempt to argue that the then-pending claims, without amendment, were patentable over the Rosenberg 350 reference. In this regard, it is my opinion that the Rosenberg 350 reference disclosed each and every limitation of the pending claims, or at a minimum rendered all pending claims obvious, and the fact that

the applicants made no effort to argue the patentability of the then-pending claims absent a narrowing amendment supports my opinion.

118. Instead, the applicants added a new limitation, amending all pending claims to require that the actuator signal is generated at least in part by haptic effect data in a lookup table (the Lookup Table Limitation). This was done in an amendment dated February 10, 2014, in which it was asserted that the amended claims were now patentable because Rosenberg 350 did not disclose the lookup table limitation.

119. Specifically, the applicants argued for patentability over Rosenberg 350 as follows:

Ex. 1104, 2/10/14 Amendment at 9 (emphasis in original).

120. The applicants further argued that written description support for the lookup table limitation may be found in the specification as filed, such as in paragraphs [0043] and [0071] as well as in Figures 9 and 10 (id. at 2/10/14 Amendment at 8),

and also added six new claims directed to various features of the lookup table.

121. In response to the applicants' argument, the examiner issued a Notice of Allowance for all twenty-six claims on 3/6/2014, and the 356 patent subsequently issued.

122. However, as I will describe in greater detail below, many other references describing touchscreen devices with haptic effects also disclose the lookup table limitation, and thus this added limitation, and the claims as a whole, cannot be considered novel or inventive.

b. The Claims of the 356 Patent are not Entitled to the Effective Filing Date of the First Provisional Application

123. As discussed above, all of the claims of the 356 patent were amended during prosecution to recite the Lookup Table Limitation to distinguish over the Rosenberg 350 reference cited by the examiner. Thus, in order for any claim of the 356 patent to be entitled to claim priority to an earlier-filed provisional application, I have been advised that such an earlier-filed application must provide 35 U.S.C. 112 written description support for the full scope of the claims as amended, including the Lookup Table Limitation. I have therefore been asked to review the provisional applications with respect to any disclosure pertaining to this limitation.

124. I note in this context that in the remarks accompanying the amendment in which the Lookup Table Limitation was added, the Applicant stated

that the subject matter of this limitation was disclosed in the 113 application in paragraphs [0043] and [0071] as well as in Figures 9 and 10. (2/10/14 Amendment at 8.) Thus, a pertinent question with respect to the provisional applications is whether they include this same type of disclosure directed to the Lookup Table Limitation.

125. The 356 patent claims priority to two provisional applications, the first of which is serial number 60/335,493 (the First Provisional), dated November 1, 2001. The First Provisional names only two of the five inventors named on the 356 patent. Ex at 1 (naming only Messrs. Martin and Vassallo). The First Provisional states that an unidentified controller will control the actuator output, but does not provide further detail, and in particular does not describe the controller as using haptic effect data in a lookup table in performing this function. First Provisional at .

126. Notably, the First Provisional lacks any disclosure corresponding to paragraphs [0043], [0071], Figure 9, or Figure 10 of the 113 Application, which the applicants relied on during prosecution for support for the Lookup Table Limitation in the claims of the 356 patent.

127. Thus, based on the legal guidelines provided to me by counsel for Apple, along with my review and analysis of the disclosure of the First Provisional, it is my opinion that no claim of the 356 patent is entitled to the effective filing

date of the First Provisional.

c. The Claims of the 356 Patent are not Entitled to the Effective Filing Date of the Second Provisional Application

128. The second provisional application to which the 356 patent claims priority is serial number 60/399,883 (the Second Provisional), dated July 31, 2002. The Second Provisional names only four of the five inventors named on the 356 patent. Second Provisional at 1 (naming only Messrs. Martin, Vassallo, Jasso, and Goldberg). The Second Provisional also states that an unidentified controller will control the actuator output, but does not provide further detail, and in particular does not describe the controller as using haptic effect data in a lookup table in performing this function. Second Provisional at .

129. Notably, the Second Provisional lacks any disclosure corresponding to paragraphs [0043], [0071], Figure 9, or Figure 10 of the 113 Application, which the applicants relied on during prosecution for support for the Lookup Table Limitation in the claims of the 356 patent.

130. Thus, the Second Provisional fails to support any claim of the 356 patent for the same reasons as provided above with respect to the First Provisional.

D. Person Having Ordinary Skill in the Art

131. It is my opinion that a person of ordinary skill in the art as of the earliest effective filing dates of the 356 patent would be a person with a bachelor's degree in computer science, electrical engineering, or a comparable field

of study, plus approximately two to three years of professional experience with software engineering, haptics programming, human-computer interaction, or other relevant industry experience. Additional graduate education could substitute for professional experience, and significant experience in the field, especially when combined with training, could substitute for formal education.

E. Claim Construction

132. I am advised that haptic effect data (claims 1, 10, 12, 20, 22, 25) should be construed to mean data identifying or describing a tactile sensation. This construction has been adopted in the pending ITC action at Patent Owner's request. Ex. 1103. The ALJ held that this construction is consistent with the specification. Id. Because the broadest reasonable construction standard is used in IPRs, I am advised that this phrase should be construed at least as broadly here.

133. I am advised that lookup table (claims 1, 11, 12, 21, 22, 26) should be construed to mean data structure in the form of a table containing associations between interactions and haptic effect data. Ex. 1103. This construction has been adopted in the pending ITC action. The ALJ held that this construction is consistent with the specification. Id. Because the broadest reasonable

construction standard is used in IPRs, I am informed that this phrase should be construed at least as broadly here. I note that Patent Owner proposed an even broader construction, which omitted the phrase "in the form of a table." Id.

134. "Standard 101-key keyboard" (claims 8, 18) should be construed to mean the IBM Model M keyboard. This claim phrase does not appear in the specification. A person of ordinary skill in the art would understand a "standard 101-key keyboard" to refer to the IBM Model M keyboard, which was introduced in 1986, featured 101 keys, and became the standard for PC keyboards for many years. Exs. 1113; 1114 (reproduced below).

F. The '356 Patent Claims

135. For reference, the claims of the '356 patent are recreated below.

Claim Language

1.pre. A method, comprising:

1.a. outputting a display signal configured to display a graphical object on

a touch-sensitive input device;

1.b. receiving a sensor signal from the touch-sensitive input device, the sensor signal indicating an object contacting the touch-sensitive input device;

1.c. determining an interaction between the object contacting the touch-sensitive input device and the graphical object;

1.d. and generating an actuator signal based at least in part on the interaction and haptic effect data in a lookup table.

2. The method of claim 1, wherein the actuator signal is configured to cause a haptic effect to be output.

3. The method of claim 1, wherein the actuator signal is generated when the object contacts the touch-sensitive device at a location corresponding to the graphical object.

4. The method of claim 1, wherein the actuator signal is generated when the object contacts the touch-sensitive device at a location not corresponding to the graphical object.

5. The method of claim 1, wherein the display signal is configured to display a keypad comprising a plurality of softkeys.

6. The method of claim 5, wherein the haptic effect is caused to be output when a user contacts the touch-sensitive device at a location corresponding to a softkey in a home position.

7. The method of claim 5, wherein the plurality of softkeys comprises one softkey for each digit from 0 to 9.

8. The method of claim 5, wherein the plurality of softkeys comprises the key configuration of a standard 101-key keyboard.

9. The method of claim 1, wherein the graphical object comprises a first graphical object and a second graphical object, the haptic effect comprises a first haptic effect and a second haptic effect, and wherein the first haptic effect is configured to be output when the object contacts the first graphical object, and the second haptic effect is configured to be output when the object contacts the second graphical object.

10. The method of claim 1, wherein the haptic effect data comprises a plurality of haptic effects.

11.
The method of claim 1, wherein the lookup table comprises one or more of input device data, position data, pressure data, or function data.

12.pre. A system, comprising:

12.a. a touch sensitive input device configured to output a sensor signal

indicating an object contacting the touch-sensitive input device;

12.b. an actuator coupled to the touch-sensitive input device, the actuator configured to receive an actuator signal and output a haptic effect to the touch-sensitive surface based at least in part on the actuator signal; and

12.c. a processor in communication with the sensor and the actuator, the processor configured to:

12.d. output a display signal configured to display a graphical object on the touch-sensitive input device;

12.e. receive the sensor signal from the touch-sensitive input device;

12.f. determine an interaction between the object contacting the touch-sensitive surface and the graphical object;

12.g. generate the actuator signal based at least in part on the interaction and haptic effect data in a lookup table; and

12.h. transmit the actuator signal to the actuator.

13. The system of claim 12, wherein the processor is configured to generate the actuator signal when the object contacts the touch-sensitive input device at a location corresponding to the graphical object.

14. The system of claim 12, wherein the processor is configured to output the actuator signal when the object contacts the touch-sensitive device at a location not corresponding to the graphical object.

15. The system of claim 12, wherein the display signal is configured to display a keypad comprising a plurality of softkeys.

16. The system of claim 15, wherein the haptic effect is caused to be output when a user contacts the touch-sensitive device at a location corresponding to a softkey in a home position.

17. The system of claim 15, wherein the plurality of softkeys comprises one softkey for each digit from 0 to 9.

18. The system of claim 15, wherein the plurality of softkeys comprises the key configuration of a standard 101-key keyboard.

19.
The system of claim 12, wherein the graphical object comprises a first graphical object and a second graphical object, the haptic effect comprises a first haptic effect and a second haptic effect, and wherein the first haptic effect is configured to be output when the object contacts the first graphical object, and the second haptic effect is configured to be output when the object contacts the second graphical object.

20. The system of claim 12, wherein the haptic effect data comprises a

plurality of haptic effects.

21. The system of claim 12, wherein the lookup table comprises one or more of input device data, position data, pressure data, or function data.

22.pre. A computer-readable medium comprising program code, comprising:

22.a. program code for outputting a display signal configured to display a graphical object on a touch-sensitive input device;

22.b. program code for receiving a sensor signal from the touch-sensitive input device, the sensor signal indicating an object contacting the touch-sensitive input device;

22.c. program code for determining an interaction between the object contacting the touch-sensitive input device and the graphical object;

22.d. and program code for generating an actuator signal based at least in part on the interaction and haptic effect data in a lookup table, the actuator signal configured to cause a haptic effect to be output.

23. The computer-readable medium of claim 22, wherein the actuator signal is generated when the object contacts the touch-sensitive device at a location corresponding to the graphical object.

24. The computer-readable medium of claim 22, wherein the actuator signal is generated when the object contacts the touch-sensitive device at a location not corresponding to the graphical object.

25. The computer-readable medium of claim 22, wherein the haptic effect data comprises a plurality of haptic effects.

26. The computer-readable medium of claim 22, wherein the lookup table comprises one or more of input device data, position data, pressure data, or function data.

IV. THE PRIOR ART

A. Fukumoto

136. WO 2002/12991 A1 ("Fukumoto WO") was published in Japanese on February 14, 2002 (Ex. 1107, certified translation Ex. 1108). It was subsequently published on October 17, 2002 in English as the corresponding U.S. national stage application, U.S. Pat. App. Pub. No. 2002/ ("Fukumoto US") (Ex. 1109).

The Fukumoto publications are substantively identical. Citations to Fukumoto herein are in parallel to the certified translation of the Fukumoto WO publication and the Fukumoto US Application Publication, e.g., Ex. 1108 at 1 [page]; Ex. 1109 at [0002] [paragraph]. I understand from Apple counsel that both Fukumoto publications qualify as printed publications under 35 U.S.C. § 102(a) (pre-AIA) because they were published before the effective filing date of the '356 patent (November 1, 2002). I have been advised that the Patent Owner has never attempted to corroborate a date of conception of the lookup tables claimed in all claims of the '356 patent prior to November 1, 2002.

137. Fukumoto is titled "Electronic apparatus, vibration generator, vibratory informing method and method for controlling information." Ex. 1108 at Cover; Ex. 1109 at Cover. Fukumoto is directed to devices and processes for providing vibratory feedback to input devices or electronic devices. Ex. 1108 at Abstract; Ex. 1109 at Abstract. The devices can include PDAs (personal digital assistants), personal computers, mobile phones, electronic notebooks, mobile computers, wristwatches, electronic calculators, remote controllers of electronic devices, and other various types of portable electronic devices. The vibration is generated by an actuator, which can be a piezoelectric element (Ex. 1108 at 48; Ex. 1109 at [0416]), a permanent magnet as a movable weight (Ex. 1108 at 48; Ex. 1109 at [0417]), an electrostatic type oscillatory actuator (Ex. 1108 at 48; Ex. 1109

at [0418]), or other suitable oscillatory actuators.

138. Fukumoto discloses that a central processing unit (CPU) executes vibration control procedures upon detecting an operation input on a touch panel. Ex. 1108 at 2; Ex. 1109 at [0149]. The CPU drives an oscillatory actuator via a drive signal generation circuit to cause the touch panel or other keys to vibrate. Id.

139. Fukumoto discloses the use of lookup tables to store waveform data corresponding to input touch buttons. In one embodiment, for each touch button displayed, the lookup table contains area or position data for the button as well as pressure levels to determine which waveform signal to send to the actuator. Ex. 1109 at [0103], [0288], [0289], et al. See also corresponding disclosures in Ex. 1108.

Ex. 1108 & Ex. 1109 at Fig. 46.

140. As I will describe in detail below, the lookup table disclosed in Fukumoto Figure 46 and accompanying text is substantially similar to the lookup table disclosed in Figure 9 of the '356 patent with respect to the features recited by claim 1. The lookup tables disclosed in Fukumoto Figures 33 (button, position, haptic effect data), 40 (parameter and haptic effect data) and 42 (function data and haptic effect data) and accompanying text similarly disclose other features derived by the inventors and claimed in the '356 patent, particularly in combination with touchscreen technology.

141. As I will further describe below, each of the lookup tables shown in Figures 46 and 40 of Fukumoto independently anticipates the Lookup Table Limitation of all claims of the '356 patent. Moreover, as described further below, simply combining the lookup tables shown in Figures 46 and 40 of Fukumoto results in a composite lookup table that is substantively identical in form and content to that of the preferred embodiment of the lookup table depicted in Figure 9 of the '356 patent. Thus, with respect to the very limitation that purportedly distinguished the '356 patent claims over the prior art (i.e., the Lookup Table Limitation), Fukumoto's disclosure not only anticipates the claim limitation, but is closely mirrored by the specific embodiment of the lookup table shown in Figure 9 of the '356 patent.
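To illustrate the general concept of such a table at a high level, the following is my own simplified sketch; the entries, names, and values are hypothetical and do not appear in either Fukumoto or the '356 patent. It shows only the claimed idea of a table structure associating an interaction (here, a touch button and a pressure level) with haptic effect data (here, a waveform identifier):

```python
# Illustrative sketch only (hypothetical names and values): a lookup
# table associating interactions with haptic effect data.

HAPTIC_LOOKUP_TABLE = {
    # (button, pressure level) -> haptic effect data (waveform identifier)
    ("A", "light"): "waveform_1",
    ("A", "strong"): "waveform_2",
    ("B", "light"): "waveform_1",
    ("B", "strong"): "waveform_3",
}

def select_waveform(button, pressure):
    """Return the haptic effect data associated with a detected
    interaction, or None if the table contains no entry for it."""
    return HAPTIC_LOOKUP_TABLE.get((button, pressure))
```

On this sketch, the same interaction always resolves to the same haptic effect data, and interactions outside the table produce no effect; the table itself is the only association between the two.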

142. In the sections that follow below, I will present an element-by-element comparison of the asserted claims of the '356 patent to the disclosures found in Fukumoto. As I will demonstrate, each and every limitation of all of the asserted claims is disclosed in Fukumoto, either expressly or inherently, or is rendered obvious. It is therefore my opinion that there is very substantial similarity between the disclosures in Fukumoto and the asserted claims of the '356 patent.

143. In the sections presented below, I will present my detailed comparison of the asserted claims of the '356 patent to the disclosures of Fukumoto that demonstrates the substantial similarities between them, beginning with claim 12 of the '356 patent.

Limitation 12.pre.: a system, comprising:

144. To the extent that the preamble is limiting, Fukumoto discloses a system. In general terms, essentially any device comprising interconnected or otherwise intercommunicating functional elements or blocks would represent a system, and Fukumoto discloses various embodiments of such systems.

145. More particularly with respect to the devices in Fukumoto, Fukumoto discloses a range of systems comprising an electronic device that provides a screen, an input device, and haptic feedback in the form of vibrations to the user of the device:

An object of the present invention is to provide an electronic device, a vibration generator, a vibration-type reporting method, and a report

control method enabling a user to easily confirm, without viewing a screen, receipt of an input operation or a response of the electronic device with respect to an operation input.

Ex. 1108 at 1; Ex. 1109 at [0006].

146. Fukumoto further discloses that the electronic device may comprise a handheld device such as a PDA, as is depicted in Figure 1:

Ex. 1108 & Ex. 1109 at Figure 1.

147. In Figure 1 above, PDA 10 includes a touch-sensitive input device having a touchscreen comprising transparent touch panel 102 overlaid on liquid crystal display panel 103a. Ex. 1108 at 13; Ex. 1109 at [0146]. The PDA further includes push-button operation keys 104a-104c, as shown. Id.

148. In addition to the PDA embodiment, Fukumoto further discloses that the disclosed invention may be utilized as a touchscreen for a mobile phone used in cellular networks or a personal handyphone system:

Further, the aspect of the invention according to the present embodiment can of course also be applied to a mobile phone serviced by a PDC (personal digital cellular) type mobile packet communication network or a PHS (personal handyphone system (registered trademark)) terminal.

Ex. 1108 at 36; Ex. 1109 at [0325].

149. Other embodiments of devices disclosed in Fukumoto include ATMs, notebook and mobile computers, wristwatches, and other portable devices:

In the first embodiment to the 12th embodiments, the explanation was made of the case of application of the present invention to a PDA or an ATM. The present invention however of course may also be applied to for example a mobile phone, electronic notebook, mobile computer, wristwatch, electronic calculator, remote controller of an electronic device, and other various types of portable electronic devices. Further, the present invention may also be applied to a stationary type computer or a vending machine, cash register, car navigation system, household electric appliance, or other of various types of electronic devices not having portability.

Ex. 1108 at 50; Ex. 1109 at [0430] (emphasis added).

150. While Fukumoto provides numerical designations for embodiments, Fukumoto discloses that all relevant embodiments are based on the same first embodiment of PDA 10 and the components contained therein as disclosed in Figures 1-5. Ex. 1109 at Figs. 1-5, 32-48, [0146]-[0181], [0259]-[0293]; Ex. 1108 at Figs. 1-5, 32-48, 13-18. Thus Fukumoto expressly discloses to one of ordinary skill in the art that the features of these embodiments are combined in his descriptions and are combinable, and a person of ordinary skill in the art would understand that they are combinable.

Limitation 12.a: a touch sensitive input device configured to output a sensor signal indicating an object contacting the touch-sensitive input device

151. Fukumoto discloses a touch sensitive input device configured to output a sensor signal indicating an object contacting the touch-sensitive input device. I note that this element merely requires outputting a sensor signal indicating contact with the touch sensitive input device, and places no further limitations on the nature of that signal.

152. As I discussed above with respect to the preamble, which I incorporate herein by reference, the electronic device disclosed in Fukumoto in a variety of forms utilizes a touchscreen comprising an overlaid transparent touch panel and LCD display.

153. One of the embodiments of the system disclosed in Fukumoto comprises a PDA having a touchscreen, as shown in Figure 1, which is reproduced below:

Ex. 1108 & Ex. 1109 at Figure 1.

154. Fukumoto describes PDA 10 shown in the figure as including transparent touch panel 102 overlaid on the liquid crystal screen 103a, thereby forming a touch-sensitive input device comprising a touchscreen:

FIG. 1 is a perspective view illustrating the appearance of a PDA 10 according to a first embodiment of the present invention. In the figure, a transparent touch panel 102 is overlaid on a display screen of a liquid crystal display panel 103a covering an opening of a main case 101. A user inputs operation instructions to the PDA 10 by touching the touch panel 102 by his or her fingertip.

Ex. 1108 at 13; Ex. 1109 at [0146].

155. Figure 2 of Fukumoto is a block diagram depicting the hardware configuration of PDA 10 that is shown in Figure 1, and is reproduced below:

Ex. 1108 & Ex. 1109 at Figure 2.

156. Fukumoto describes how the touch-sensitive input device comprising touch panel 102 outputs a sensor signal indicating contact from an object, which Fukumoto calls a "touch signal":

The touch panel 102 outputs a signal showing a touched position on the touch panel 102 (hereinafter called a "touch signal") to the CPU 113 in response to a touch operation.

Ex. 1108 at 13; Ex. 1109 at [0148] (emphasis added).

157. He further explains that the CPU 113 shown in Figure 2 above detects the touch signal:

As shown in the figure, first, the CPU 113 determines whether a touch signal has been input from the touch panel 102 and whether a key operation signal has been input from the key input unit 111 (step S101).

Ex. 1108 at 16; Ex. 1109 at [0168] (emphasis added).
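The control flow described in the passage quoted above can be illustrated at a high level as follows. This is my own simplified sketch, not code from Fukumoto or the '356 patent; the region coordinates and waveform values are hypothetical. It shows only the general sequence: check for a touch signal, determine whether the touched position falls within a displayed button, and if so select waveform data to pass on toward the actuator:

```python
# Illustrative sketch only (hypothetical data): a simplified version of
# the sequence described at step S101 -- check for a touch signal,
# locate the touched button, and select waveform data for the actuator.

BUTTON_REGIONS = {"A": (0, 0, 50, 50)}    # button -> (x0, y0, x1, y1)
WAVEFORMS = {"A": [0.0, 1.0, 0.0, -1.0]}  # button -> waveform samples

def button_at(x, y):
    """Return the button whose display region contains (x, y), else None."""
    for name, (x0, y0, x1, y1) in BUTTON_REGIONS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

def vibration_control(touch_signal):
    """Given a touch signal (x, y) or None, return the waveform data to
    forward to the drive circuit, or None if no vibration is generated."""
    if touch_signal is None:           # no touch signal input: do nothing
        return None
    button = button_at(*touch_signal)  # touch outside any button region:
    if button is None:                 # terminate without vibrating
        return None
    return WAVEFORMS[button]           # waveform data read from memory
```

In this sketch, as in the quoted disclosure, a touch outside the display region of any button produces no vibration, while a touch on a button selects the stored waveform data for that button.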

158. As further evidence supporting my opinion that Fukumoto discloses this limitation, see also: Ex. 1109 at [0169], [0237], [0264], [0287] (touch signal indicates pressure), [0289], [0302], [0305], [0322], [0343], [0348], [0369], and Figs. 2-3, 12, 16, 19-31 and accompanying text; Fukumoto WO at 1 (e.g., touch operation performed by a fingertip or "accessory pen"), 13 (e.g., "touch signal" from touch panel of PDA 10), 16 (e.g., "Incidentally, even if the CPU 113 determines that a touch signal has been input from the touch panel 102 in step S101, the process of step S102 will not be performed and the vibration control process 1 will be terminated if the touch position on the touch panel 102 based on this signal is outside of the display region of a touch button that is displayed on the display screen, for example."), 17, 24-25, 28, 31, 33, 39, 41; Figs. 1-3, 12, 16, 19-31.

159. Thus, as shown above, Fukumoto discloses a touch sensitive input device configured to output a sensor signal indicating an object contacting the touch-sensitive input device, as claimed in the '356 patent.

Limitation 12.b: an actuator coupled to the touch-sensitive input device, the actuator configured to receive an actuator signal and output a haptic effect to the touch-sensitive surface based at least in part on the actuator signal;

160. As I will describe below, Fukumoto discloses an actuator coupled to the touch-sensitive input device, the actuator configured to receive an actuator signal and output a haptic effect to the touch-sensitive surface based at least in

part on the actuator signal.

a. Fukumoto discloses an actuator coupled to the touch-sensitive input device:

161. In the PDA embodiment shown in Figure 1 above, Fukumoto discloses an oscillatory actuator 115 that is coupled to the touchscreen in each of the embodiments disclosed. The oscillatory actuator 115 is shown, for example, in Figure 2 depicting the hardware configuration of the first PDA embodiment, and shown with accompanying description below:

Ex. 1108 & Ex. 1109 at Figure 2.

Next, FIG. 2 is a block diagram illustrating the hardware configuration of the PDA 10 shown in FIG. 1. As shown in this figure, the PDA 10 has a touch panel 102, a display unit 103, a key input unit 111, a memory 112, a CPU (central processing unit) 113, a drive signal generation circuit 114, and an oscillatory actuator 115.

Ex. 1108 at 13; Ex. 1109 at [0147] (emphasis added).

162. Figure 3 of Fukumoto illustrates the location of oscillatory actuator 115 within the PDA 10, and is reproduced below:

Ex. 1108 & Ex. 1109 at Figure 3.

163. As can be observed in this figure, the internal components of actuator 115 are surrounded by a case 115a, and the upper portion of this actuator case is in contact with the liquid crystal display panel 103a, which is in turn directly in contact with touch panel 102. Thus, Fukumoto discloses an actuator coupled to the touch-sensitive input device comprising touch panel 102. This is described further in the passage below:

FIG. 3 is a sectional view schematically illustrating a state of placement of the oscillatory actuator 115 in the main case 101 of the PDA 10. As shown in the figure, the top surface of the case 115a of the oscillatory actuator 115 is in contact with the liquid crystal display

panel 103a and operation keys 104a to 104c. Further, the case 115a of the oscillatory actuator 115 is provided inside it with a cylindrical coil 121 fixed to the top surface of the case 115a, a columnar movable weight 122 made of permanent magnet and having an annular space in which the coil 121 fits, and a spring 123 for supporting the movable weight 122.

Ex. 1108 at 14; Ex. 1109 at [0152] (emphasis added).

b. Fukumoto discloses the actuator configured to receive an actuator signal and output a haptic effect to the touch-sensitive surface based at least in part on the actuator signal

164. As shown in Figure 2 below, the oscillatory actuator 115 is configured to receive an actuator signal delivered via drive signal generation circuit 114:

Ex. 1108 & Ex. 1109 at Figure 2.

165. Fukumoto describes how this actuator signal causes the oscillatory actuator 115 to output a haptic effect comprising a vibration to the touch-sensitive surface based at least in part on the actuator signal. This is disclosed, for example,

in the following passages:

The CPU 113 executes a program stored in the memory 112 to control the parts of the device interconnected through a bus 116. This CPU 113 executes a vibration control processing 1 (see FIG. 5). Upon detection of an operation input from the touch panel 102 or any one of operation keys 104a to 104c, it drives the oscillatory actuator 115 through the drive signal generation circuit 114 to cause the touch panel 102 or one of the operation keys 104a to 104c to vibrate.

Ex. 1108 at 13-14; Ex. 1109 at [0149] (emphasis added).

Next, the CPU 113 outputs the waveform data read from the memory 112 to the drive signal generation circuit 114. At the same time, the CPU 113 instructs the drive signal generation circuit 114 to generate a drive signal (step S103). In response to the processing of step S103, the drive signal generation circuit 114 generates a drive signal using the waveform data supplied from the CPU 113.

Ex. 1108 at 16-17; Ex. 1109 at [0170] (emphasis added).

Further, the present invention provides an electronic device provided with an operating unit for receiving an operation input, a vibration generator able to give vibration to a user and simultaneously cause the generation of sound, and drive control means for combining a drive signal for driving the vibration generator to cause generation of vibration and an audio signal for driving the vibration generator to cause generation of sound and applying the combined signal to the vibration generator in the case of causing generation of vibration and sound from the vibration generator in the case of detecting that an operation input to the operating unit has been received.

Ex. 1108 at 8; Ex. 1109 at [0045] (emphasis added).

166. The haptic effect is output to the surface of the touch-sensitive input device as a vibration that is orthogonal to the surface. This is depicted graphically in Figure 3 in the region of the contacting finger:

Ex. 1108 & Ex. 1109 at Figure 3, showing haptic effect at contacting finger.

167. Fukumoto explains that by coupling the actuator 115 to the touch-sensitive input device in this way, the received actuator signal causes the haptic effect to be output at the surface of the device:

Therefore, by building the oscillatory actuator 115 into the PDA 10 so that the direction of vibration becomes perpendicular to the front surface of the touch panel 102 or the direction of depression of the operation keys 104a to 104c, it is possible to give the user by vibrational stimulus the feeling of pressing the touch buttons or operation keys when touching the touch panel 102 or when depressing thin operation keys 104a to 104c.

Ex. 1108 at 17; Ex. 1109 at [0175].

168. As further evidence supporting my opinion that Fukumoto discloses this limitation, see also: Ex. 1109 at [0050], [0052], [0148], [0168], [0173], [0196], [0198], [0204], [0205], [0210],

[0213], [0225], [0227], [0229], [0248], [0271], [0279], [0289], [0303], [0312], [0332], [0338], [0358], [0366], [0370], [0377], [0383], [0386], [0406], [0414], [0420], [0424]; Figs. 5-6, 11, 13, 14-15, 18, 47, 48, 52, 57-58, 63, 82-83; claims 3, 11, 20, 48, 59, 69, 79, 91, 96-98, 99, 129, 131; Ex. 1108 at 8, 13, 14, 16-24, 26-49, Figs. 5-6, 11, 13, 14-15, 18, 47, 48, 52, 57-58, 63.

169. Thus, as shown above, Fukumoto discloses the actuator configured to receive an actuator signal and output a haptic effect to the touch-sensitive surface based at least in part on the actuator signal.

Limitation 12.c: a processor in communication with the sensor and the actuator, the processor configured to:

170. Fukumoto discloses a processor in communication with the sensor and the actuator.

171. For example, Figure 2 shows the hardware configuration of PDA 10 of Figure 1, which includes CPU 113 comprising a processor in electrical communication with both the sensor, comprising touch panel 102, and the oscillatory actuator 115. Figure 2, along with accompanying description, is shown below:

Ex. 1108 & Ex. 1109 at Figure 2.

Next, FIG. 2 is a block diagram illustrating the hardware configuration of the PDA 10 shown in FIG. 1. As shown in this figure, the PDA 10 has a touch panel 102, a display unit 103, a key input unit 111, a memory 112, a CPU (central processing unit) 113, a drive signal generation circuit 114, and an oscillatory actuator 115.

Ex. 1108 at 13; Ex. 1109 at [0147] (emphasis added).

172. Fukumoto explains that the processor comprising CPU 113 is in electrical communication with the sensor comprising touch panel 102, from which it receives sensor signals. CPU 113 is also in electrical communication with the oscillatory actuator 115, to which it sends drive signals:

The CPU 113 executes a program stored in the memory 112 to control the parts of the device interconnected through a bus 116. This CPU 113 executes a vibration control processing 1 (see FIG. 5). Upon detection of an operation input from the touch panel 102 or any one of operation keys 104a to 104c, it drives the oscillatory actuator 115 through the drive signal generation circuit 114 to cause the touch panel 102 or one of the operation keys 104a to 104c to vibrate.

Ex. 1108 at 13-14; Ex. 1109 at [0149] (emphasis added).

173. As explained by Fukumoto, the haptic effect may be different based on the graphical object contacted by the user, or based on the interaction with the graphical object, such as the position of the knob. Ex. 1108 & Ex. 1109 at Figs. 32, 33, 42-48; Ex. 1108 at 28-32; Ex. 1109 at [0259]-[0293]. These interactions are associated with haptic effects in a lookup table. Id. The signals are received by the CPU, the CPU detects the interaction, and the CPU, executing programmable code from memory, generates the actuator signal based on the detected interaction and the haptic effect data comprising waveform data in the lookup table. Id.; Ex. 1108 & Ex. 1109 at Figs. 1-2, 5-6, 13-14, 17-18, and accompanying text.

174. As further evidence supporting my opinion that Fukumoto discloses this limitation, see also: Ex. 1109 at [0190], [0195], [0283], [0287], [0289], [0292], [0312], [0314], [0317], [0366], [0405], [0408]; Figs. 6, 13, 47, 52. As further evidence supporting my opinion that Fukumoto discloses this limitation, see also: Fukumoto WO at 13 (e.g., CPU 113 receives "touch signal" from touch panel 102), (e.g., CPU 113 drives the vibration actuator 115),

14, 16-22, 24, 25, 28-37, 46, 47; Figs. 2, 6, 13, 47, 52.

Limitation 12.d: output a display signal configured to display a graphical object on the touch-sensitive input device;

175. Fukumoto discloses that the processor outputs a display signal configured to display a graphical object on the touch-sensitive input device.

176. As shown in Figure 2, CPU 113 is in electrical communication with display unit 103 (liquid crystal display panel), and provides signals to display unit 103:

Ex. 1108 & Ex. 1109 at Figure 2.

Next, FIG. 2 is a block diagram illustrating the hardware configuration of the PDA 10 shown in FIG. 1. As shown in this figure, the PDA 10 has a touch panel 102, a display unit 103, a key input unit 111, a memory 112, a CPU (central processing unit) 113, a drive signal generation circuit 114, and an oscillatory actuator 115.

Ex. 1108 at 13; Ex. 1109 at [0147].

177. Fukumoto discloses that the display signal output from CPU 113 is

configured to display graphical objects on the touch-sensitive input device, such as softkeys that he refers to as "touch buttons":

Note that even when the CPU 113 determines that a touch signal has been input from the touch panel 102 at step S101, when it detects that a touched position of the touch panel 102 based on the signal falls outside the display areas of the touch buttons displayed on the display screen, the routine does not proceed to the processing of step S102, and the vibration control processing 1 ends.

Ex. 1108 at 16; Ex. 1109 at [0169] (emphasis added).

178. Examples of graphical objects displayed on the screen of PDA 10 are shown, for example, in Figures 32 and 41 of Fukumoto. For example, Figure 32 shows several graphical objects comprising touch buttons or softkeys being displayed on the screen of the PDA device:

Ex. 1108 & Ex. 1109 at Figure 32.

179. Fukumoto identifies the graphical objects labeled A-G and displayed

on the touch-sensitive device 102 as "touch buttons," which a person of ordinary skill in the art would understand to be synonymous with "softkeys," the term that is used in the '356 patent:

FIG. 32 is a view of an example of the screen display of the PDA 10 according to a first example of the present embodiment. As shown in the figure, the display screen of the PDA 10 shows a plurality of touch buttons A to G. The touch panel 102 overlaid on the display screen detects a touch operation when a user touches a displayed touch button by his or her fingertip. Note that the letters assigned to the touch buttons are only given for identifying the touch buttons.

Ex. 1108 at 28; Ex. 1109 at [0261] (emphasis added).

180. Another example of graphical icons being displayed on the PDA of Fukumoto is shown in Figures 38-39, which depict folder and trash icons familiar to most users of computing devices. These figures are reproduced below:

181. As explained by Fukumoto, these figures depict a drag and drop operation, as would be used when deleting the folder by dropping it in the trash

bin:

FIG. 38 and FIG. 39 are views illustrating the state where a user is dragging an icon displayed on the display screen of the PDA 10 by a touch operation on the touch panel 102 to transfer it to the trash. Note that the trash spoken of here is an icon for instructing deletion of data.

Ex. 1108 at 29; Ex. 1109 at [0270] (emphasis added).

182. As a final example of a graphical object being displayed on the touch-sensitive input device, Figure 41 depicts a simulated slider control using a knob that can be dragged along the scale as shown below:

Ex. 1108 & Ex. 1109 at Figure 41.

183. Fukumoto describes how the graphical objects shown above can simulate, for example, a sliding control button for adjusting a parameter such as audio volume or screen brightness:

FIG. 41 is a view of an example of the screen display of a PDA 10 according to a third example of this embodiment. As shown in the figure, the display screen of the PDA 10 displays a scale and knob for adjusting the value of a parameter such as the level of sound of the PDA 10 or the brightness of the screen. The user can drag and change the position of the knob displayed on the screen by a touch operation on the touch panel 102.

Ex at 30; Fukumoto at [0277], (emphasis added).

184. As further evidence supporting my opinion that Fukumoto discloses this limitation, see also: Ex at [0179], [0199], [0232], [ ], [ ], [ ], [0348], [0357]; Figs. 6, 13, 17, 38-39, 41, 47, 52.

185. As further evidence supporting my opinion that Fukumoto discloses this limitation, see also: Fukumoto PCT at 13 (e.g., "display panel 103"), 14-16, 18 (e.g., "touch buttons"), 19, 23-25, 27, 28 (e.g., plurality of touch buttons "A" to "G" that are displayed on the screen), 33, 34, 39, 46; Figs. 6, 13, 17, 32, 38-39, 41, 47, 52.

186. As shown above, Fukumoto discloses a processor configured to output a display signal configured to display a graphical object on the touch-sensitive input device.

Limitation 12.e: receive the sensor signal from the touch-sensitive input device;

187. Fukumoto discloses that the processor receives the sensor signal from the touch-sensitive input device.

188. As I discussed above with respect to the preamble, which I

incorporate here by reference, the electronic device disclosed in Fukumoto in a variety of forms utilizes a touchscreen comprising an overlaid transparent touch panel and LCD display.

189. One of the embodiments of the system disclosed in Fukumoto comprises a PDA having a touchscreen, as shown in Figure 1, which is reproduced below:

Ex & Ex at Figure 1.

190. Fukumoto describes PDA 10 shown in Figure 1 as including transparent touch panel 102 overlaid on the liquid crystal screen 103a, thereby forming a touch-sensitive input device comprising a touchscreen:

FIG. 1 is a perspective view illustrating the appearance of a PDA 10 according to a first embodiment of the present invention. In the figure, a transparent touch panel 102 is overlaid on a display screen of a liquid crystal display panel 103a covering an opening of a main case

101. A user inputs operation instructions to the PDA 10 by touching the touch panel 102 by his or her fingertip.

Ex at 13; Ex at [0146].

191. Figure 2 of Fukumoto is a block diagram depicting the hardware configuration of PDA 10 that is shown in Figure 1, and is reproduced below:

Ex & Ex at Figure 2.

192. As shown above, CPU 113 is in electrical communication with touch panel 102, from which it receives a signal responsive to contact, or touch. Fukumoto describes how the touch-sensitive input device comprising touch panel 102 outputs a sensor signal indicating contact from an object, which Fukumoto calls a "touch signal," and further discloses that this signal is received by the CPU 113:

The touch panel 102 outputs a signal showing a touched position on the touch panel 102 (hereinafter called a "touch signal") to the CPU 113 in response to a touch operation.

Ex at 13; Ex at [0148], (emphasis added).

193. He further explains that the CPU 113 shown in Figure 2 above detects the touch signal as an input, meaning that it receives this signal:

As shown in the figure, first, the CPU 113 determines whether a touch signal has been input from the touch panel 102 and whether a key operation signal has been input from the key input unit 111 (step S101).

Ex at 16; Ex at [0168], (emphasis added).

194. As further evidence supporting my opinion that Fukumoto discloses this limitation, see also: Fukumoto WO at 1 (e.g., "touch operation performed by a fingertip or accessory pen"), 13 (e.g., "touch signal from touch panel of PDA 10"), 16 (e.g., "Incidentally, even if the CPU 113 determines that a touch signal has been input from the touch panel 102 in step S101, the process of step S102 will not be performed and the vibration control process 1 will be terminated if the touch position on the touch panel 102 based on this signal is outside of the display region of a touch button that is displayed on the display screen, for example."), 17, 24-25, 28, 31, 33, 39, 41; Figs. 2-3, 12, 16, 19-31.

195. Thus, as shown above, Fukumoto discloses a processor configured to receive the sensor signal from the touch-sensitive input device, as claimed in the '356 patent.

Limitation 12.f: determine an interaction between the object contacting the touch-sensitive surface and the graphical object;

196. Fukumoto discloses that the processor determines an interaction between the object contacting the touch-sensitive surface and the graphical object.

197. A person of ordinary skill in the art would understand the claimed "interaction between the object contacting the touch-sensitive surface and the graphical object" to broadly encompass determination of a wide variety of possible modes of interaction. These would include at least those based on the relative position of the contacting object with respect to that of the graphical object on the display surface, and/or the level of force or pressure associated with the area of contact, among others.

198. As described above with respect to claim elements 12c and 12d, which discussion I incorporate here by reference, a person of ordinary skill in the art would understand that CPU 113 as shown in Figure 2 is a processor.

199. As I will describe below, CPU 113 performs processing which determines both the relative position of the contacting object with respect to that of a displayed graphical object, as well as a determination of the level of pressure associated with the area of contact. It further classifies the level of pressure into categories based on pressure thresholds, such that this additional information pertaining to pressure levels may be used for determining which haptic effects are to be output to the surface in response to the contact. It further identifies functions associated with the graphical object being contacted, and movement interactions,

such as drag operations. I will address each of these functions in turn, below.

i. Fukumoto discloses determining a positional interaction between the object contacting the touch-sensitive surface and the graphical object

200. With respect to determining a positional interaction, Fukumoto discloses that CPU 113 determines a positional interaction between the object contacting the touch panel 102 and a graphical object.

201. The first step of this process is determining the location on the surface where an object such as a finger contacts the touch-sensitive surface. Fukumoto discloses that the touch panel 102 outputs a touch signal which indicates the position of contact of the object, and that this touch signal is sent to the CPU:

The touch panel 102 outputs a signal showing a touched position on the touch panel 102 (hereinafter called a "touch signal") to the CPU 113 in response to a touch operation.

Ex at 13; Ex at [0148].

202. Fukumoto explains how the CPU then determines whether the contact position conveyed by this touch signal corresponds to the area on the screen of a displayed graphical object, such as a touch button. Fukumoto describes this operation, for example, with respect to the flow chart shown in Figure 5, which is reproduced below:

Ex & Ex at Figure 5.

203. As explained by Fukumoto, the CPU executes the algorithm shown in Figure 5 to determine whether or not the touch signal indicates that an object is contacting the touch panel at a position corresponding to the display areas of the touch buttons on the screen:

As shown in the figure, first, the CPU 113 determines whether a touch signal has been input from the touch panel 102 and whether a key operation signal has been input from the key input unit 111 (step S101).

Ex at 16; Ex at [0168], (emphasis added).

Note that even when the CPU 113 determines that a touch signal has been input from the touch panel 102 at step S101, when it detects that a touched position of the touch panel 102 based on the signal falls outside the display areas of the touch buttons displayed on the display screen, the routine does not proceed to the processing of step S102, the vibration control processing 1 ends.

Ex at 16; Ex at [0169].

204. As described above, the CPU 113 determines a positional interaction between the object contacting the surface of the device and the displayed graphical object, such as a touch button. If the point of contact falls outside the display areas of the touch buttons displayed on the display screen, the routine does not proceed to the processing of step S102. As shown in Figure 5 above, in this event, the operation ceases.

205. If instead the point of contact falls within the display area of one of the touch buttons displayed on the display screen, the routine will proceed to the processing of step S102. As shown in Figure 5 above, in this event, the operation will proceed to read waveform data for the actuator signal at step S102, and use this to output an actuator signal to produce a haptic effect in response.

206. Another example of a disclosure in Fukumoto describing this determination of a positional interaction is made with respect to Figure 32, shown below, which depicts a number of touch buttons labeled A-G:

Ex & Ex at Figure 32.

FIG. 32 is a view of an example of the screen display of the PDA 10 according to a first example of present embodiment. As shown in the figure, the display screen of the PDA 10 shows a plurality of touch buttons "A" to "G". The touch panel 102 overlaid on the display screen detects a touch operation when a user touches a displayed touch button by his or her fingertip. Note that the letters assigned to the touch buttons are only given for identifying the touch buttons.

Ex at 28; Ex at [0261], (emphasis added).

207. Fukumoto explains that it is the CPU 113 that identifies the touched button when a user touches a displayed touch button with their fingertip:

As explained above, according to the first example of this embodiment, in the case of detecting a touch operation on the touch panel 102, the CPU 113 first detects a touched position and identifies the operated touch button. The CPU 113 then causes vibration to be generated from the oscillatory actuator 115 by a vibration mode linked with the type of the touch button.

Ex at 29; Ex at [0266], (emphasis added).
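The positional determination Fukumoto attributes to CPU 113 — compare the touched coordinates against each touch button's stored area data, then read that button's waveform data — can be illustrated with a short sketch. The following Python pseudocode is my own illustration, not code from Fukumoto; the coordinate rectangles and waveform names are hypothetical placeholders:

```python
# Illustrative sketch only (not from Fukumoto): hit-test a touched
# position against stored touch-button area data (step S101), then
# read that button's waveform data (step S102).

# Hypothetical area data: button name -> (x0, y0, x1, y1) rectangle.
BUTTON_AREAS = {
    "A": (0, 0, 40, 20),
    "B": (50, 0, 90, 20),
}

def identify_button(x, y):
    """Return the touch button containing (x, y), or None if outside all."""
    for name, (x0, y0, x1, y1) in BUTTON_AREAS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

def vibration_control(x, y):
    """Outside every button, processing simply ends (no vibration)."""
    button = identify_button(x, y)
    if button is None:
        return None                 # falls outside all display areas
    return f"waveform_{button}"     # waveform data read at step S102
```

In this sketch, a touch inside a button's rectangle yields that button's waveform, while a touch outside every rectangle ends processing with no effect, mirroring the two branches of Figure 5.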

208. As a final example of determining a positional interaction, Fukumoto describes determining the positional interaction between the point of contact of a user's finger, and a graphical object in the form of a folder icon during a drag and drop operation. This is illustrated in Figures 38-39, which depict a folder icon being contacted by a finger (FIG. 38) and subsequently dragged into the trash icon for deletion (FIG. 39). These figures are reproduced below:

209. As explained by Fukumoto, the processor is configured to determine the correspondence of the position of the point of contact with that of the graphical object displayed as a folder icon on the PDA screen. This is detected as a touch operation on the icon, which selects it for the drag operation, as described below:

First, when a user selects the icon desired to be dragged by the touch operation on the touch panel 102, the CPU 113 of the PDA 10 detects the touched position and identifies the touch operation as an instruction for the selection of the icon. The memory 112 of the PDA 10 stores the waveform data table 112b, as shown in FIG. 40, storing

the waveform data of the drive signal to be applied to the oscillatory actuator 115 for each type of instruction designated by an operation input. The CPU 113 reads the waveform data linked with "SELECT ICON" from the waveform data table 112b and drives the oscillatory actuator 115. As a result, the fingertip of the user performing the touch operation or the hand of the user holding the PDA 10 is given vibration indicating that the icon is selected.

Ex at 29; Ex at [ ].

210. Thus, for at least the reasons directed to a positional interaction as set forth above, Fukumoto discloses a processor configured to determine an interaction between the object contacting the touch-sensitive surface and the graphical object.

211. As I will discuss next, Fukumoto also discloses this limitation in the context of determining a pressure interaction between the object contacting the touch-sensitive surface and a graphical object.

ii. Fukumoto discloses determining a pressure interaction between the object contacting the touch-sensitive surface and the graphical object

212. When an object such as a finger contacts the touch-sensitive surface at a point inside of an area corresponding to a graphical object such as a touch button as described above, the processor comprising CPU 113 additionally determines and classifies the level of pressure being applied to the region of the displayed graphical object. Fukumoto explains that this determination is used to distinguish between a touch operation and a press operation, where the press operation is

active to select the function associated with the graphical object.

213. This general function is described in the following disclosure:

Further, the present invention provides an electronic device provided with an operating unit for receiving an operation input and detecting a level of pressure of the operation input, a vibration generator for imparting vibration to a user, and vibration control means for causing vibration to be generated from the vibration generator by a vibration mode linked with the level of pressure of the operation input detected by the operating unit in the case of detecting that an operation input to the operating unit has been received.

Ex at 7; Ex at [0037], (emphasis added).

According to the present invention, the electronic device reports to the user that an operation input has been received by a vibration mode in accordance with the level of pressure of the operation input.

Ex at 7; Ex at [0038], (emphasis added).

214. Fukumoto describes how the pressure-level determination is used in part to select the particular haptic effect to deliver to the surface of the device with respect to the first PDA embodiment at paragraphs [0286]-[0289], for example:

In this embodiment, explanation will be given on an electronic device for reporting to the user that a touch operation has been received by vibration of a mode that differs in accordance with the level of pressure of the touch operation on the touch panel. Note that in this embodiment, the explanation will be given based on the PDA 10 explained in the first embodiment.

Ex at 31; Ex at [0286], (emphasis added).

215. Fukumoto explains further that the touch signal output by touch panel 102 not only indicates the position of contact, but further indicates the level of pressure being applied, and thus can be used to differentiate between a touch

operation, in which the level of applied pressure is lower, from that of a pressing operation, where the level of applied pressure is higher.2 This classification of the input operation as either the lower-pressure touch operation or the higher-pressure pressing operation is performed by the CPU 113 by comparing the detected pressure to a predetermined level of pressure, serving as a threshold for comparison:

In this embodiment, the touch panel of the PDA 10 can detect two operating states, that is, the state where the fingertip of the user is in contact with the touch panel (hereinafter, in this embodiment, called a "touch operation") and the state where the fingertip is pressing the touch panel by a force of more than a predetermined level of pressure (hereinafter, in this embodiment, called a "pressing operation"). The type of touch signal output from the touch panel to the CPU 113 differs from the case of a touch operation to the case of a pressing operation.

Ex at 31; Ex at [0287], (emphasis added).

216. Further disclosures related to the determination of a pressure-level interaction between the object contacting the surface of the device and a graphical object displayed on its screen are provided below:

Further, the memory 112 of the PDA 10 according to the present embodiment stores the waveform data table 112d shown in FIG. 46. The waveform data table 112d corresponds to the example of the screen display of the touch buttons shown in FIG. 32. The waveform data table 112d stores for each touch button the area data for the touch button and the waveform data to be applied to the oscillatory actuator

2 Fukumoto also discloses an alternative embodiment in which a second touch panel is utilized to provide pressure information to the CPU. See, e.g., Fukumoto US at [0293].

115 for each of the cases when the touch button is subjected to a touch operation and when it is subjected to a pressing operation.

Ex at 31; Ex at [0288], (emphasis added).

In a PDA 10 having this configuration, when a touch operation is performed on the touch panel, the touch panel outputs to the CPU 113 a touch signal showing that a touch operation has been performed. The CPU 113 finds the coordinate data of the touched position based on the touch signal and identifies the touch button operated referring to the waveform data table 112d. Next, the CPU 113 reads the waveform data for the touch operation linked with the identified touch button from the waveform data table 112d. The CPU 113 then drives the oscillatory actuator 115 using the drive signal generated by the read waveform data. The same is true for the case when a pressing operation is performed on the touch panel. The CPU 113 reads the waveform data for the pressing operation linked with the operated touch button from the waveform data table 112d and drives the oscillatory actuator 115.

Ex at 31; Ex at [0289], (emphasis added).

81. An electronic device, comprising: an operating unit for receiving an operation input and detecting a level of pressure of said operation input; a vibration generator for generating vibration which is transmitted to a user; and vibration control means for, in a case of detecting that an operation input to said operating unit has been received, causing said vibration generator to generate vibration by a vibration mode linked with a level of pressure of said operation input detected by said operating unit.

Ex at claim 81.

217. Thus, for at least the additional reasons directed to a pressure-level interaction as set forth above, Fukumoto discloses a processor configured to

determine an interaction between the object contacting the touch-sensitive surface and the graphical object.

iii. Fukumoto discloses determining a key-function interaction between the object contacting the touch-sensitive surface and the graphical object

218. As a final example of determining an interaction, Fukumoto also discloses determining a key-function interaction. This is the same type of determination shown in Figure 9 of the '356 patent under the column of the table labeled Function, further demonstrating the similarity between the disclosures of Fukumoto and the specification and claims of the '356 patent.

219. Figures 38-40 and accompanying text disclose a lookup table containing function data (operation instruction), such as select icon, drag, and delete data. This aspect of Fukumoto's system is described with respect to graphical icons being displayed on the PDA of Fukumoto and shown in Figures 38-39, which depict folder and trash icons, familiar to most users of computing devices. These figures are reproduced below:

220. As explained by Fukumoto, these figures depict a drag and drop operation, as would be used when deleting the folder by dropping it in the trash bin:

FIG.38 and FIG.39 are views illustrating the state where a user is dragging an icon displayed on the display screen of the PDA 10 by a touch operation on the touch panel 102 to transfer it to the trash. Note that the trash spoken of here is an icon for instructing deletion of data.

Ex at 29; Ex at [0270], (emphasis added).

221. Fukumoto describes operation instructions or functions such as select, drag, and delete data associated with this action:

First, when a user selects the icon desired to be dragged by the touch operation on the touch panel 102, the CPU 113 of the PDA 10 detects the touched position and identifies the touch operation as an instruction for the selection of the icon. The memory 112 of the PDA 10 stores the waveform data table 112b, as shown in FIG. 40, storing the waveform data of the drive signal to be applied to the oscillatory

actuator 115 for each type of instruction designated by an operation input.

Ex at 29; Ex at [0271].

222. The lookup table referenced above with respect to this operation is shown in Figure 40 of Fukumoto, which is reproduced below:

Ex & Ex at Figure 40.

223. As can be observed in this figure, a variety of different functional instructions corresponding to different actions are stored in the lookup table, including Enter, Cancel, Click, Drag, Select Icon, etc. I note that these correspond to the functions that are disclosed and claimed in the '356 patent as being in its lookup table. See, e.g., Ex at Figure 9, FUNCTION column.

224. Fukumoto explains that the first step of the drag and drop operation depicted in Figures 38-39 is to select the folder icon by touching it with a finger. CPU 113 detects this instruction, and lookup table 112b associates this function with a first set of waveform data, which results in a first actuator signal and haptic effect being generated:

The CPU 113 reads the waveform data linked with "SELECT ICON" from the waveform data table 112b and drives the oscillatory actuator 115. As a result, the fingertip of the user performing the touch operation or the hand of the user holding the PDA 10 is given vibration indicating that the icon is selected.

Ex at 29; Ex at [0272], (emphasis added).

225. Fukumoto explains that the next step of the drag and drop operation depicted in Figures 38-39 is to drag the folder icon while maintaining contact with the finger. Lookup table 112b associates this drag function with a different set of waveform data, which results in a different actuator signal and haptic effect being generated:

Further, as shown in FIG. 38, when the user moves his or her fingertip while in contact with the touch panel 102 to drag the selected icon, the CPU 113 identifies the touch operation as an instruction for the dragging of the icon. Therefore, the CPU 113 reads the waveform data linked with "DRAG" from the waveform data table 112b and drives the oscillatory actuator 115. Due to this, vibration showing that a drag operation is under way is transmitted to the user. For example, when a drag operation is under way, it is preferable to continuously give a weak vibration.

Ex at 29; Fukumoto at [0273], (emphasis added).

226. Fukumoto explains that the final step of the drag and drop operation depicted in Figures 38-39 is to perform the functional instruction to delete data of the folder icon by dragging it on top of the trash icon. This is shown in Figure 39. CPU 113 detects this operation, and lookup table 112b associates this function with another set of waveform data, which results in yet a different actuator signal and haptic effect being generated:

Further, as shown in FIG. 39, when the dragged icon is superposed over the trash, the CPU 113 identifies the touch operation as an instruction for the placement of the icon in the trash. Therefore, while the CPU 113 stores the icon in the trash, it reads the waveform data linked with "DELETE DATA" from the waveform data table 112b and drives the oscillatory actuator 115. As a result, the user performing the touch operation is given vibration indicating deletion of the icon.

Ex at 29; Fukumoto at [0274], (emphasis added).

227. As I have established above, Fukumoto discloses that the processor will determine positional, pressure-contact, movement (drag), and key-function interactions between the object contacting the surface and graphical objects displayed on the screen. These same types of interactions are disclosed and claimed in the '356 patent.

228. Further, the disclosed lookup tables of Fukumoto also associate sets of haptic effect data with functional operations, such as selecting, deleting data, dragging, etc., as shown in Figure 40. I note that these same types of functions in a lookup table were subsequently disclosed and claimed in the '356 patent.
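The three-step drag and drop sequence described above — each operation instruction keyed to its own waveform data in table 112b — can be illustrated with a short sketch. The following Python pseudocode is my own illustration in the shape of Figure 40, not code from Fukumoto; the waveform names are hypothetical placeholders:

```python
# Illustrative sketch only (not from Fukumoto): a Figure-40-style
# lookup table associating each operation instruction with the
# waveform data used to drive the oscillatory actuator.
WAVEFORM_TABLE_112B = {
    "SELECT ICON": "short_pulse",       # hypothetical waveform names
    "DRAG": "weak_continuous",
    "DELETE DATA": "confirm_pulse",
}

def drag_and_drop_sequence():
    """Waveforms driven during the select -> drag -> delete sequence."""
    steps = ("SELECT ICON", "DRAG", "DELETE DATA")
    return [WAVEFORM_TABLE_112B[step] for step in steps]
```

The point of the sketch is that each identified instruction, not the touch position alone, selects a distinct haptic effect from the table.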

Limitation 12.g: generate the actuator signal based at least in part on the interaction and haptic effect data in a lookup table; and

229. Fukumoto discloses a processor configured to generate the actuator signal based at least in part on the interaction and haptic effect data in a lookup table.

230. As I have established above, Fukumoto discloses a processor configured to determine an interaction between the object contacting the touch-sensitive surface and the graphical object. In particular, as part of my analyses directed to claim element 12f, which I incorporate here by reference, I showed that Fukumoto discloses determining at least positional, pressure-level, movement, and function interactions between the object contacting the touch-sensitive surface and the graphical object.

231. As I described above, the positional determination of interaction is performed by the CPU 113, which compares the positional coordinates of the point of contact derived from the touch signal output by touch panel 102 to that of the area occupied by the graphical object. Fukumoto explains that this detection of positional correspondence is indicated to the user by one or more haptic effects which are generated in response to this detection.

232. As I further described above, the pressure-level determination of interaction is also performed by the CPU 113, which determines the pressure being applied to the surface at the location of the graphical object, and compares this to a

predetermined pressure threshold. Fukumoto explains that the touch signal generated by the touch panel 102 indicates not only the position or location of contact, but also indicates the level of pressure being applied. The level of pressure is then compared to the predetermined pressure threshold to distinguish between a touch operation (a lower pressure contact) and that of a pressing operation (a higher pressure contact). Fukumoto explains that the detection of these different levels of pressure is indicated to the user by causing the device to respond with distinct haptic effects.

233. As I further described above, the movement and function determination of interaction is also performed by the CPU 113, which associates contact with displayed softkeys comprising graphical objects with functions such as select, delete, drag, etc., and thereby determines the function or motion interactions of the graphical object.

234. As I will describe in detail below, Fukumoto discloses that the processor will generate the actuator signal based at least in part on the interaction, and further, Fukumoto discloses and/or renders obvious that the processor will generate the actuator signal based at least in part on haptic effect data in a lookup table. I will address each of these in turn, below.
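The pressure-level determination described above — classify the contact against a predetermined threshold, then read the matching waveform from a table keyed by button and operation type — can be illustrated with a short sketch. The following Python pseudocode is my own illustration in the shape of Figure 46, not code from Fukumoto; the threshold value and waveform names are hypothetical placeholders:

```python
# Illustrative sketch only (not from Fukumoto): classify contact
# pressure as a touch operation or a pressing operation against a
# predetermined threshold, then read the matching waveform from a
# Figure-46-style table keyed by touch button and operation type.
PRESS_THRESHOLD = 0.5   # hypothetical "predetermined level of pressure"

WAVEFORM_TABLE_112D = {
    "A": {"touch": "wave_A_touch", "pressing": "wave_A_press"},
    "B": {"touch": "wave_B_touch", "pressing": "wave_B_press"},
}

def classify_operation(pressure):
    """Lower-pressure contact is a touch; above threshold, a press."""
    return "pressing" if pressure > PRESS_THRESHOLD else "touch"

def select_waveform(button, pressure):
    """Per-button waveform depends on the classified operation type."""
    return WAVEFORM_TABLE_112D[button][classify_operation(pressure)]
```

As the sketch shows, the same button yields different waveform data depending on whether the classified operation is a touch or a press.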

i. Fukumoto discloses that the processor is configured to generate the actuator signal based at least in part on the interaction

235. As I showed above with respect to Figure 2 and related disclosure, the piezoelectric oscillatory actuator 115 is configured to receive an actuator signal delivered via drive signal generation circuit 114. Fukumoto describes that this actuator signal is generated at least in part by an interaction, including, for example, the positional, pressure-level, movement, and key-function interactions.

236. As explained by Fukumoto, the haptic effect that is generated in response to a touch operation is different based on the graphical object contacted by the user, or based on other interaction with the graphical object, such as the position of the knob or pressure applied. This is shown, for example, in Ex at Figs. 32, 33, 42-48; [0259]-[0293]; Ex at Figs. 32, 33, 42-48; pp. These interactions are associated with haptic effects in a lookup table, as I will discuss further below. Id. The signals are received by the CPU, the CPU detects the interaction, and the CPU executing programmable code from memory generates the actuator signal based on the determined interaction, and the haptic effect data comprising waveform data in the lookup table. Id.; Ex & Ex at Figs. 1-2, 5-6, 13-14, 17-18, and accompanying text.

237. For example, Fukumoto describes that different vibration modes are generated based upon which touch button was pressed:

As explained above, according to the first example of this embodiment, in the case of detecting a touch operation on the touch panel 102, the CPU 113 first detects a touched position and identifies the operated touch button. The CPU 113 then causes vibration to be generated from the oscillatory actuator 115 by a vibration mode linked with the type of the touch button.

Ex at 29; Ex at [0266], (emphasis added).

238. This is further explained with respect to Figure 33, shown below:

Ex & Ex at Figure 33.

239. Figure 33 shows a first lookup table corresponding to the PDA screen of Figure 32, displaying touch buttons labeled A, B, C... The lookup table of Figure 33 includes entries for individual touch buttons and their corresponding area or position on the screen:

Next, FIG. 33 is a view illustrating a waveform data table 112a stored in the memory 112 in the PDA 10. As shown in the figure, the waveform data table 112a stores, for each touch button displayed on

the screen, area data showing the area occupied by a touch button on the touch panel 102 using XY coordinates as well as waveform data of the drive signal to be applied to the oscillatory actuator 115 when that touch button is pressed.

Ex at 31; Fukumoto at [0286], (emphasis added).

240. Thus, as shown above, Fukumoto discloses that the processor is configured to generate the actuator signal based at least in part on a determined positional interaction. Information about the positional interaction is stored in the lookup table, as shown in Figure 33.

241. As a further example, Figure 46 of Fukumoto shows a lookup table corresponding to the PDA screen of Figure 32, displaying touch buttons labeled A, B, C... The lookup table of Figure 46 includes entries for individual touch buttons, their corresponding area or position on the screen, and pressure-level classifications, i.e., touch operations (lower contact pressure) and pressing operations (higher contact pressure). Figure 46 is reproduced below:

Ex & Ex at Figure 46.

242. Fukumoto explains that the different waveform data used to produce the different actuator signals are stored in this table, and are retrieved based in part on the determined interactions, including both the positional and pressure-level interactions:

FIG. 46 is a view illustrating a waveform data table stored in a memory in a PDA according to a seventh embodiment of the present invention.

Ex at 11; Ex at [0103].

Further, the memory 112 of the PDA 10 according to the present embodiment stores the waveform data table 112d shown in FIG. 46. The waveform data table 112d corresponds to the example of the screen display of the touch buttons shown in FIG. 32. The waveform data table 112d stores for each touch button the area data for the touch button and the waveform data to be applied to the oscillatory actuator 115 for each of the cases when the touch button is subjected to a touch operation and when it is subjected to a pressing operation.

Ex at 31; Ex at [0288], (emphasis added).

In a PDA 10 having this configuration, when a touch operation is performed on the touch panel, the touch panel outputs to the CPU 113 a touch signal showing that a touch operation has been performed. The CPU 113 finds the coordinate data of the touched position based on the touch signal and identifies the touch button operated referring to the waveform data table 112d. Next, the CPU 113 reads the waveform data for the touch operation linked with the identified touch button from the waveform data table 112d. The CPU 113 then drives the oscillatory actuator 115 using the drive signal generated by the read waveform data. The same is true for the case when a pressing operation is performed on the touch panel. The CPU 113 reads the waveform data for the pressing operation linked with the operated touch button from the waveform data table 112d and drives the oscillatory actuator 115. Ex at 31; Ex at [0289], (emphasis added).

Thus, as shown above, Fukumoto discloses that the processor is configured to generate the actuator signal based at least in part on both a determined positional and pressure-level interaction, each of which is reflected by entries in the lookup table shown in Figure 46.

A further example is disclosed with respect to the simulated slider control button that may be displayed on the screen of PDA 10, as depicted in Figure 41, which shows a simulated slider control using a knob that can be dragged along the scale. Figure 42 illustrates a lookup table used to generate the actuator signal in conjunction with this displayed function, and both figures are reproduced below:

Ex & Ex at Figure 41.

Fukumoto describes how the graphical objects shown above can simulate, for example, a sliding control button for adjusting a parameter such as audio volume or screen brightness:

FIG. 41 is a view of an example of the screen display of a PDA 10 according to a third example of this embodiment. As shown in the figure, the display screen of the PDA 10 displays a scale and knob for adjusting the value of a parameter such as the level of sound of the PDA 10 or the brightness of the screen. The user can drag and change the position of the knob displayed on the screen by a touch operation on the touch panel 102. Ex at 30; Ex at [0277], (emphasis added).

Fukumoto discloses that a lookup table is used to associate specified parameter ranges along the sliding scale with different haptic effects, which are used in part to generate the actuator signal. This lookup table is depicted in Figure 42,

which is reproduced below:

Ex & Ex at Figure 42.

Fukumoto discloses that the particular set of waveform data used to generate different actuator signals is selected based on the determined position of the knob with respect to four parameter value ranges of the sliding scale, as shown above. Thus, this disclosure represents another example of the CPU configured to generate the actuator signal based at least in part on the determined interaction, at least a positional interaction:

Next, when the user moves his or her fingertip while in contact with the touch panel 102 and drags the knob along the scale, the CPU 113 recognizes that the knob is being dragged. Here, the memory 112 of the PDA 10 according to the third example of this embodiment stores the waveform data table 112c shown in FIG. 42. This waveform data table 112c divides the range of values which the parameter can take into several sections and stores waveform data of the drive signal to be applied to the oscillatory actuator 115 for each section. Ex at 30; Ex at [0279], (emphasis added).

The CPU 113 obtains the value of the parameter in accordance with the position of the dragged knob, reads the waveform data linked with the current value of the parameter from the waveform data table 112c, and drives the oscillatory actuator 115. Therefore, while the knob is being dragged, the fingertip of the user performing the touch operation or the hand of the user holding the PDA 10 is given vibration in accordance with the value of the parameter. Ex at 30; Ex at [0280].

ii. Fukumoto discloses that the processor is configured to generate the actuator signal based at least in part on the interaction stored in a lookup table

As I established above, Fukumoto discloses that the various determined interactions are stored in a lookup table, and that the actuator signal is generated from the associated waveform data based in part on a determined interaction stored in the lookup table. These lookup tables are depicted in Figures 33, 40, 42, and 46, and show, for example, positional, pressure-level, and key-function interactions stored in these lookup tables:

Ex & Ex at Figures 33 & 42.
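To illustrate the lookup-table mechanism described in the passages above, the following is a minimal sketch of how a waveform data table like those of Figures 33 and 46 could associate each touch button's screen area and pressure class with waveform data. This is an illustration only, not Fukumoto's implementation; all names, coordinates, and the pressure threshold are hypothetical assumptions:

```python
# Illustrative sketch (not Fukumoto's actual code): a waveform data table
# keyed by touch button, mirroring the area data + per-operation waveform
# entries of Figures 33 and 46. Values are hypothetical.

PRESS_THRESHOLD = 0.5  # assumed pressure level dividing touch vs. pressing operations

# Each entry: button label -> screen area (x0, y0, x1, y1) and waveform data
# for a touch operation (lower pressure) and a pressing operation (higher pressure).
waveform_table = {
    "A": {"area": (0, 0, 40, 20),  "touch": "waveform_A_touch", "press": "waveform_A_press"},
    "B": {"area": (50, 0, 90, 20), "touch": "waveform_B_touch", "press": "waveform_B_press"},
}

def select_waveform(x, y, pressure):
    """Return the waveform data for the contacted button, or None when no
    button occupies the touched position.  The XY coordinates supply the
    positional interaction; the pressure level supplies the pressure-level
    interaction distinguishing touch from pressing operations."""
    operation = "press" if pressure >= PRESS_THRESHOLD else "touch"
    for entry in waveform_table.values():
        x0, y0, x1, y1 = entry["area"]
        if x0 <= x <= x1 and y0 <= y <= y1:
            return entry[operation]
    return None
```

The sketch reflects only the two-step retrieval Fukumoto describes: identify the operated button from the touched coordinates, then read the waveform data linked with that button and operation type.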

Ex & Ex at Figures 40 & 46.

Thus, for at least the reasons set forth above, Fukumoto also discloses a processor configured to generate the actuator signal based at least in part on the interaction... stored in a lookup table.

iii. Fukumoto discloses and/or renders obvious a processor that is configured to generate the actuator signal based at least in part on the interaction and haptic effect data stored in a lookup table

As I have established above, Fukumoto discloses a processor configured to determine an interaction between the object contacting the touch-sensitive surface and the graphical object. In particular, as part of my analyses directed to claim element 12f, which I incorporate here by reference, I showed that Fukumoto discloses determining at least both a positional and a pressure-level

interaction between the object contacting the touch-sensitive surface and the graphical object.

As I described above, the positional determination of interaction is performed by the CPU 113, which compares the positional coordinates of the point of contact derived from the touch signal output by touch panel 102 to those of the area occupied by the graphical object. Fukumoto explains that this detection of positional correspondence is indicated to the user by one or more haptic effects generated in response to this detection.

As I further described above, the pressure-level determination of interaction is also performed by the CPU 113, which determines the pressure being applied to the surface at the location of the graphical object and compares it to a predetermined pressure threshold. Fukumoto explains that the touch signal generated by the touch panel 102 indicates not only the position of contact but also the level of pressure being applied. The level of pressure is then compared to the predetermined pressure threshold to distinguish between a touch operation (a lower-pressure contact) and a pressing operation (a higher-pressure contact). Fukumoto explains that the detection of these different levels of pressure is indicated to the user by causing the device to respond with distinct haptic effects.

As I further described above, the movement and key-function determination of interaction is also performed by the CPU 113, which associates

contact with displayed softkeys comprising graphical objects with functions such as select, delete, drag, etc., and thereby determines the function or motion interactions of the graphical object.

As I further established above, Fukumoto discloses that the processor will generate the actuator signal based at least in part on the interaction. Below, I will further demonstrate that Fukumoto discloses and/or renders obvious that the processor will generate the actuator signal based at least in part on haptic effect data in a lookup table.

Fukumoto discloses that the processor is configured to generate the actuator signal based at least in part on the interaction

As I showed above with respect to Figure 2 and related disclosure, the piezoelectric oscillatory actuator 115 is configured to receive an actuator signal delivered via drive signal generation circuit 114. Fukumoto describes that this actuator signal is generated at least in part by retrieving waveform data from a lookup table based on one or more determined interactions.

As explained by Fukumoto, the haptic effect that is generated in response to a touch operation differs based on the graphical object contacted by the user, or based on other interaction with the graphical object, such as the position of the knob or the pressure applied. This is shown, for example, in Ex at Figs. 32, 33, 42-48; [0259]-[0293]; Ex at Figs. 1-5, 32-48, 13-18. These interactions are associated with haptic effects in a lookup table, as I will

discuss further below. Id. The signals are received by the CPU, the CPU detects the interaction, and the CPU, executing programmable code from memory, generates the actuator signal based on the detected interaction and the waveform in the lookup table. Id.; Ex and Ex at Figs. 1-2, 5-6, 13-14, 17-18, and accompanying text.

For example, Fukumoto describes that different vibration modes are generated based upon which touch button was pressed, and that the actuator signal that is generated produces a vibration mode linked with the type of touch button:

As explained above, according to the first example of this embodiment, in the case of detecting a touch operation on the touch panel 102, the CPU 113 first detects a touched position and identifies the operated touch button. The CPU 113 then causes vibration to be generated from the oscillatory actuator 115 by a vibration mode linked with the type of the touch button. Ex at 29; Ex at [0266], (emphasis added).

This is further explained with respect to Figure 33, shown below:

Ex & Ex at Figure 33.

Figure 33 shows a first lookup table corresponding to the PDA screen of Figure 32, displaying touch buttons labeled A, B, C... The lookup table of Figure 33 includes entries for individual touch buttons and their corresponding area or position on the screen. It further includes sets of waveform data associated with each of the different touch buttons:

Next, FIG. 33 is a view illustrating a waveform data table 112a stored in the memory 112 in the PDA 10. As shown in the figure, the waveform data table 112a stores, for each touch button displayed on the screen, area data showing the area occupied by a touch button on the touch panel 102 using XY coordinates as well as waveform data of the drive signal to be applied to the oscillatory actuator 115 when that touch button is pressed. Ex at 31; Ex at [0286], (emphasis added).

A person of ordinary skill in the art at the time of the alleged invention would understand the waveform data table such as is shown in Figure

33 to disclose a lookup table, and would further understand the sets of waveform data contained in the table to be haptic effect data in a lookup table.

Thus, as shown above, Fukumoto discloses that the processor comprising CPU 113 is configured to generate the actuator signal based at least in part on the interaction and haptic effect data in a lookup table.

As a further example, Figure 46 of Fukumoto shows a lookup table corresponding to the PDA screen of Figure 32, displaying touch buttons labeled A, B, C... The lookup table of Figure 46 includes entries for individual touch buttons, their corresponding area or position on the screen, and pressure-level classifications, i.e., touch operations (lower contact pressure) and pressing operations (higher contact pressure). It further includes sets of waveform data associated with each of the different touch buttons, and with the two different contact pressure levels for each button.

Figure 46 is reproduced below:

Ex & Ex at Figure 46.

Fukumoto explains that the different waveform data used to produce the different actuator signals are stored in this table and are retrieved based in part on the determined interactions, including both the positional and pressure-level interactions:

FIG. 46 is a view illustrating a waveform data table stored in a memory in a PDA according to a seventh embodiment of the present invention. Ex at 11; Ex at [0103].

Further, the memory 112 of the PDA 10 according to the present embodiment stores the waveform data table 112d shown in FIG. 46. The waveform data table 112d corresponds to the example of the screen display of the touch buttons shown in FIG. 32. The waveform data table 112d stores for each touch button the area data for the touch button and the waveform data to be applied to the oscillatory actuator 115 for each of the cases when the touch button is subjected to a touch operation and when it is subjected to a pressing operation.

Ex at 31; Ex at [0288], (emphasis added).

In a PDA 10 having this configuration, when a touch operation is performed on the touch panel, the touch panel outputs to the CPU 113 a touch signal showing that a touch operation has been performed. The CPU 113 finds the coordinate data of the touched position based on the touch signal and identifies the touch button operated referring to the waveform data table 112d. Next, the CPU 113 reads the waveform data for the touch operation linked with the identified touch button from the waveform data table 112d. The CPU 113 then drives the oscillatory actuator 115 using the drive signal generated by the read waveform data. The same is true for the case when a pressing operation is performed on the touch panel. The CPU 113 reads the waveform data for the pressing operation linked with the operated touch button from the waveform data table 112d and drives the oscillatory actuator 115. Ex at 31; Ex at [0289], (emphasis added).

A person of ordinary skill in the art at the time of the alleged invention would understand the waveform data table such as is shown in Figure 46 to disclose a lookup table, and would further understand the sets of waveform data contained in the table to be haptic effect data in a lookup table.

Thus, as shown above, Fukumoto discloses that the processor comprising CPU 113 is configured to generate the actuator signal based at least in part on positional and pressure-level interactions and haptic effect data in a lookup table, and thus discloses element 12g.

A further example is disclosed with respect to the simulated slider control button that may be displayed on the screen of PDA 10, as depicted in Figure 41. Figure 41 depicts a simulated slider control using a knob that can be

dragged along the scale as shown below. Figure 42 illustrates a lookup table used to generate the actuator signal in conjunction with this displayed function, and both figures are reproduced below:

Ex & Fukumoto at Figure 41.

Fukumoto describes how the graphical objects shown above can simulate, for example, a sliding control button for adjusting a parameter such as audio volume or screen brightness:

FIG. 41 is a view of an example of the screen display of a PDA 10 according to a third example of this embodiment. As shown in the figure, the display screen of the PDA 10 displays a scale and knob for adjusting the value of a parameter such as the level of sound of the PDA 10 or the brightness of the screen. The user can drag and change the position of the knob displayed on the screen by a touch operation on the touch panel 102. Ex at 30; Ex at [0277], (emphasis added).

271. Fukumoto discloses that a lookup table is used to associate specified parameter ranges along the sliding scale with different haptic effects, which are used in part to generate the actuator signal. This lookup table is depicted in Figure 42, which is reproduced below:

Ex & Ex at Figure 42.

Fukumoto discloses that the particular set of waveform data used to generate different actuator signals is selected based on the determined position of the knob with respect to four parameter value ranges of the sliding scale, as shown above. Thus, this disclosure represents another example of the CPU configured to generate the actuator signal based at least in part on the determined interaction, at least a positional interaction.

The lookup table shown in Figure 42 further includes sets of waveform data that are associated with each of the different parameter value ranges:

Next, when the user moves his or her fingertip while in contact with the touch panel 102 and drags the knob along the scale, the CPU 113 recognizes that the knob is being dragged. Here, the memory 112 of the PDA 10 according to the third example of this embodiment stores the waveform data table 112c shown in FIG. 42. This waveform data table 112c divides the range of values which the parameter can take into several sections and stores waveform data of the drive signal to be applied to the oscillatory actuator 115 for each section. Ex at 30; Ex at [0279], (emphasis added).

The CPU 113 obtains the value of the parameter in accordance with the position of the dragged knob, reads the waveform data linked with the current value of the parameter from the waveform data table 112c, and drives the oscillatory actuator 115. Therefore, while the knob is being dragged, the fingertip of the user performing the touch operation or the hand of the user holding the PDA 10 is given vibration in accordance with the value of the parameter. Ex at [0280].

A person of ordinary skill in the art at the time of the alleged invention would understand the waveform data table such as is shown in Figure 42 to disclose a lookup table, and would further understand the sets of waveform data contained in the table to be haptic effect data in a lookup table.

Thus, as shown above, Fukumoto discloses that the processor comprising CPU 113 is configured to generate the actuator signal based at least in part on positional and/or contact-movement interactions and haptic effect data in a lookup table, and thus discloses element 12g.
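The parameter-range retrieval Fukumoto describes for the slider of Figures 41 and 42 can be sketched as follows. The section boundaries and waveform names are assumptions for illustration only; Fukumoto discloses four value ranges but not their specific bounds:

```python
# Illustrative sketch of a table like 112c in Figure 42: the parameter's
# value range is divided into sections, each linked to waveform data.
# Ranges and names are hypothetical, not taken from Fukumoto.

# (lower bound inclusive, upper bound exclusive) -> waveform data
slider_table = [
    ((0, 25), "waveform_1"),
    ((25, 50), "waveform_2"),
    ((50, 75), "waveform_3"),
    ((75, 101), "waveform_4"),
]

def waveform_for_knob(value):
    """Map the dragged knob's current parameter value to the waveform data
    stored for the section containing that value, so the vibration given
    to the user varies with the value of the parameter."""
    for (lo, hi), waveform in slider_table:
        if lo <= value < hi:
            return waveform
    raise ValueError("parameter value out of range")
```

As the knob is dragged across a section boundary, the lookup returns different waveform data, which is how distinct haptic effects track the parameter value.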

277. As a final example, Figures 38-40 and accompanying text disclose a lookup table containing function data (operation instructions), such as select icon, drag, and delete data. This aspect of Fukumoto's system is described with respect to graphical icons being displayed on the PDA of Fukumoto and shown in Figures 38-39, which depict folder and trash icons familiar to most users of computing devices. These figures are reproduced below:

278. As explained by Fukumoto, these figures depict a drag-and-drop operation, as would be used when deleting the folder by dropping it in the trash bin:

FIG. 38 and FIG. 39 are views illustrating the state where a user is dragging an icon displayed on the display screen of the PDA 10 by a touch operation on the touch panel 102 to transfer it to the trash. Note that the trash spoken of here is an icon for instructing deletion of data. Ex at 29; Ex at [0270], (emphasis added).

279. Fukumoto describes operation instructions such as select, drag, and delete data associated with this action:

First, when a user selects the icon desired to be dragged by the touch operation on the touch panel 102, the CPU 113 of the PDA 10 detects the touched position and identifies the touch operation as an instruction for the selection of the icon. The memory 112 of the PDA 10 stores the waveform data table 112b, as shown in FIG. 40, storing the waveform data of the drive signal to be applied to the oscillatory actuator 115 for each type of instruction designated by an operation input. Ex at 29; Ex at [0271].

The lookup table referenced above with respect to this operation is shown in Figure 40 of Fukumoto, which is reproduced below:

Ex & Ex at Figure 40.
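A minimal sketch of how a table like 112b of Figure 40 could link operation instructions to waveform data during the drag-and-drop sequence Fukumoto describes. The Python names and waveform values are hypothetical, though the instruction labels mirror those shown in Figure 40:

```python
# Illustrative sketch (not Fukumoto's implementation): each operation
# instruction is linked to its own waveform data, so each step of the
# drag-and-drop sequence produces a distinct haptic effect.

instruction_table = {
    "SELECT ICON": "waveform_select",   # icon touched and selected
    "DRAG":        "waveform_drag",     # icon dragged; e.g., continuous weak vibration
    "DELETE DATA": "waveform_delete",   # icon superposed over the trash
}

def waveforms_for_drag_and_drop():
    """Return the waveform data retrieved at each step of dragging an
    icon to the trash, in the order Fukumoto describes."""
    steps = ["SELECT ICON", "DRAG", "DELETE DATA"]
    return [instruction_table[step] for step in steps]
```

The point of the sketch is only that a single lookup keyed on the identified instruction yields a different actuator waveform at each step of the operation.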

281. As can be observed in this figure, a variety of different functional instructions corresponding to different actions are stored in the lookup table, including Enter, Cancel, Click, Drag, Select Icon, etc. I note that these correspond to the functions that are disclosed and claimed in the 356 patent as being in its lookup table.

Fukumoto explains that the first step of the drag-and-drop operation depicted in Figures 38-39 is to select the folder icon by touching it with a finger. CPU 113 detects this instruction, and lookup table 112b associates this function with a first set of waveform data, which results in a first actuator signal and haptic effect being generated:

The CPU 113 reads the waveform data linked with SELECT ICON from the waveform data table 112b and drives the oscillatory actuator 115. As a result, the fingertip of the user performing the touch operation or the hand of the user holding the PDA 10 is given vibration indicating that the icon is selected. Ex at 29; Ex at [0272], (emphasis added).

Fukumoto explains that the next step of the drag-and-drop operation depicted in Figures 38-39 is to drag the folder icon while maintaining contact with the finger. Lookup table 112b associates this drag function with a different set of waveform data, which results in a different actuator signal and haptic effect being generated:

Further, as shown in FIG. 38, when the user moves his or her fingertip while in contact with the touch panel 102 to drag the selected icon, the CPU 113 identifies the touch operation as an instruction for the dragging of the icon. Therefore, the CPU 113 reads the waveform data linked with DRAG from the waveform data table 112b and drives the oscillatory actuator 115. Due to this, vibration showing that a drag operation is under way is transmitted to the user. For example, when a drag operation is under way, it is preferable to continuously give a weak vibration. Ex at 29; Fukumoto at [0273], (emphasis added).

Fukumoto explains that the final step of the drag-and-drop operation depicted in Figures 38-39 is to perform the functional instruction to delete data of the folder icon by dragging it on top of the trash icon. This is shown in Figure 39. CPU 113 detects this operation, and lookup table 112b associates this function with another set of waveform data, which results in yet a different actuator signal and haptic effect being generated:

Further, as shown in FIG. 39, when the dragged icon is superposed over the trash, the CPU 113 identifies the touch operation as an instruction for the placement of the icon in the trash. Therefore, while the CPU 113 stores the icon in the trash, it reads the waveform data linked with DELETE DATA from the waveform data table 112b and drives the oscillatory actuator 115. As a result, the user performing the touch operation is given vibration indicating deletion of the icon. Ex at 29; Fukumoto at [0274].

As I have established above, Fukumoto discloses that the processor will generate the actuator signal based at least in part on haptic effect data in a lookup table. The haptic data comprising the sets of waveform data stored in the

lookup tables of Fukumoto are retrieved based on interaction entries in the table, including determined positional and pressure-contact interactions between the object contacting the surface and graphical objects displayed on the screen. These same two types of interactions are disclosed and claimed in the 356 patent.

Further, the disclosed lookup tables of Fukumoto also associate sets of haptic effect data with functional operations, such as selecting, deleting data, dragging, etc., as shown in Figure 40. I note that these same types of functions in a lookup table were subsequently disclosed and claimed in the 356 patent.

As established above, each of the lookup tables is a data structure in the form of a table containing associations between interactions and haptic effect data. Specifically, Fig. 46 and accompanying text disclose a data structure in the form of a table (Fig. 46) containing associations between interactions (positional and pressure interactions on graphical objects) and haptic effect data (waveforms). Ex at 31; Ex at [0289]. Figure 40 and accompanying text disclose a data structure in the form of a table (Fig. 40) containing associations between interactions (functions) and haptic effect data (waveforms). Ex at 29; Ex at [0272]. Figures 41, 42 and accompanying text disclose a data structure in the form of a table (Figs. 41, 42) containing associations between interactions (parameter values, positional interaction) and haptic effect data (waveforms). Ex at 30; Ex at [0279]. And Figure

33 and accompanying text disclose a data structure in the form of a table (Fig. 33) containing associations between interactions (selected touch buttons) and haptic effect data (waveforms). Ex at 31; Ex at [0286].

Limitation 12.h: transmit the actuator signal to the actuator

Fukumoto discloses that the processor transmits the actuator signal to the actuator. As I showed above in my analysis directed to claim element 12c, which I incorporate here by reference, Fukumoto discloses a processor comprising CPU 113, as shown in Figure 2:

Ex & Ex at Figure 2.

As I established above with respect to Figure 2, the piezoelectric oscillatory actuator 115 is configured to receive an actuator signal delivered via drive signal generation circuit 114:

Next, FIG. 2 is a block diagram illustrating the hardware configuration of the PDA 10 shown in FIG. 1. As shown in this figure, the PDA 10 has a touch panel 102, a display unit 103, a key input unit 111, a memory 112, a CPU (central processing unit) 113, a drive signal generation circuit 114, and an oscillatory actuator 115. Ex at 13; Ex at [0147].

Fukumoto describes how this actuator signal is transmitted to the actuator and causes the oscillatory actuator 115 to generate a haptic effect, such as a vibration:

The CPU 113 executes a program stored in the memory 112 to control the parts of the device interconnected through a bus 116. This CPU 113 executes a vibration control processing 1 (see FIG. 5). Upon detection of an operation input from the touch panel 102 or any one of operation keys 104a to 104c, it drives the oscillatory actuator 115 through the drive signal generation circuit 114 to cause the touch panel 102 or one of the operation keys 104a to 104c to vibrate. Ex at 13-14; Ex at [0149].

Next, the CPU 113 outputs the waveform data read from the memory 112 to the drive signal generation circuit 114. At the same time, the CPU 113 instructs the drive signal generation circuit 114 to generate a drive signal (step S103). In response to the processing of step S103, the drive signal generation circuit 114 generates a drive signal using the waveform data supplied from the CPU 113. Ex at 16-17; Ex at [0170].
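The transmission path described in the passages above (CPU 113 reads waveform data from memory 112, the drive signal generation circuit 114 synthesizes a drive signal from it, and the oscillatory actuator 115 receives that signal) can be sketched as follows. The classes and the string stand-in for the drive signal are assumptions for illustration; Fukumoto's circuit and actuator are hardware components, not software:

```python
# Illustrative sketch (hypothetical classes, not Fukumoto's implementation)
# of the signal path in Figure 2: CPU -> drive signal generation circuit ->
# oscillatory actuator.

class DriveSignalGenerationCircuit:
    def generate(self, waveform_data):
        # Stand-in for analog drive-signal synthesis from waveform data.
        return f"drive_signal({waveform_data})"

class OscillatoryActuator:
    def __init__(self):
        self.last_signal = None

    def apply(self, drive_signal):
        # The actuator vibrating in response to the applied drive signal
        # is modeled only by recording the received signal.
        self.last_signal = drive_signal

def transmit_actuator_signal(memory, key, circuit, actuator):
    """CPU-side step: read the waveform data from memory and drive the
    actuator through the drive signal generation circuit."""
    waveform_data = memory[key]
    actuator.apply(circuit.generate(waveform_data))
```

The sketch captures only the claim-12.h point being made: the processor transmits the actuator signal to the actuator via the intervening drive circuitry.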
Further, the present invention provides an electronic device provided with an operating unit for receiving an operation input, a vibration generator able to give vibration to a user and simultaneously cause the generation of sound, and drive control means for combining a drive signal for driving the vibration generator to cause generation of vibration and an audio signal for driving the vibration generator to cause generation of sound and applying the combined signal to the vibration generator in the case of causing generation of vibration and

sound from the vibration generator in the case of detecting that an operation input to the operating unit has been received. Ex at 8; Ex at [0045].

As shown above, Fukumoto discloses that the processor comprising CPU 113 is configured to transmit the actuator signal to the actuator.

Claim 13: The system of claim 12, wherein the processor is configured to generate the actuator signal when the object contacts the touch-sensitive input device at a location corresponding to the graphical object

Fukumoto discloses a processor that is configured to generate the actuator signal when the object contacts the touch-sensitive input device at a location corresponding to the graphical object. This was described in detail in my analysis of claim elements 12f and 12g, which I incorporate here by reference. By way of summary, it is described, for example, in the following passages:

Next, FIG. 2 is a block diagram illustrating the hardware configuration of the PDA 10 shown in FIG. 1. As shown in this figure, the PDA 10 has a touch panel 102, a display unit 103, a key input unit 111, a memory 112, a CPU (central processing unit) 113, a drive signal generation circuit 114, and an oscillatory actuator 115. Ex at 13; Ex at [0147].

Ex & Ex at Figure 2.

The CPU 113 executes a program stored in the memory 112 to control the parts of the device interconnected through a bus 116. This CPU 113 executes a vibration control processing 1 (see FIG. 5). Upon detection of an operation input from the touch panel 102 or any one of operation keys 104a to 104c, it drives the oscillatory actuator 115 through the drive signal generation circuit 114 to cause the touch panel 102 or one of the operation keys 104a to 104c to vibrate. Ex at 13-14; Ex at [0149].

Next, the CPU 113 outputs the waveform data read from the memory 112 to the drive signal generation circuit 114. At the same time, the CPU 113 instructs the drive signal generation circuit 114 to generate a drive signal (step S103). In response to the processing of step S103, the drive signal generation circuit 114 generates a drive signal using the waveform data supplied from the CPU 113. Ex at 16-17; Ex at [0170].

Further, the present invention provides an electronic device provided with an operating unit for receiving an operation input, a vibration generator able to give vibration to a user and simultaneously cause the generation of sound, and drive control means for combining a drive

signal for driving the vibration generator to cause generation of vibration and an audio signal for driving the vibration generator to cause generation of sound and applying the combined signal to the vibration generator in the case of causing generation of vibration and sound from the vibration generator in the case of detecting that an operation input to the operating unit has been received. Ex at 8; Ex at [0045].

Further, the memory 112 of the PDA 10 according to the present embodiment stores the waveform data table 112d shown in FIG. 46. The waveform data table 112d corresponds to the example of the screen display of the touch buttons shown in FIG. 32. The waveform data table 112d stores for each touch button the area data for the touch button and the waveform data to be applied to the oscillatory actuator 115 for each of the cases when the touch button is subjected to a touch operation and when it is subjected to a pressing operation. Ex at 31; Ex at [0288].

In a PDA 10 having this configuration, when a touch operation is performed on the touch panel, the touch panel outputs to the CPU 113 a touch signal showing that a touch operation has been performed. The CPU 113 finds the coordinate data of the touched position based on the touch signal and identifies the touch button operated referring to the waveform data table 112d. Next, the CPU 113 reads the waveform data for the touch operation linked with the identified touch button from the waveform data table 112d. The CPU 113 then drives the oscillatory actuator 115 using the drive signal generated by the read waveform data. The same is true for the case when a pressing operation is performed on the touch panel. The CPU 113 reads the waveform data for the pressing operation linked with the operated touch button from the waveform data table 112d and drives the oscillatory actuator 115.
Ex at 31; Ex at [0289].

As further evidence supporting my opinion that Fukumoto discloses this limitation, see also: Ex at [0050], [0052], [0148], [ ], [ ],

[0168], [ ], [0173], [ ], [0196], [0198], [0204], [0205], [0210], [0213], [0225], [0227], [0229], [0248], [ ], [0271], [0279], [ ], [ ], [0303], [ ], [0312], [0332], [0338], [ ], [ ], [ ], [0358], [0366], [0370], [0377], [ ], [0383], [0386], [ ], [ ], [0406], [0414], [0420], [0424]; Figs. 5-6, 11, 13, 14-15, 18, 47, 48, 52, 57-58, 63, 82-83; claims 3, 11, 20, 48, 59, 69, 79, 91, 96-98, 99, 129, 131. See also Fukumoto PCT at 1, 16-18, 21, 28-29, 31 ( Herein, the waveform data table 112d illustrated in Fig. 46 is stored in the memory 112 of the PDA 10 according to this embodiment. This waveform data table 112d corresponds to a screen display example of a touch button illustrated in Fig. 32. The area data of each touch button, and the waveform data that is applied to the vibration actuator 115 when the touch button is touch operated and when the touch button is pressing operated, is stored in the waveform data table 112d. ), 32, 39, 40; Figs. 2, 5-6, 11, 13, 14-15, 18, 47, 48, 52, 57-58, 63, 82-83; claims 3, 11, 20, 48, 59, 69, 79, 91, 96-98, 99, 129, 131.

Claim 15: The system of claim 12, wherein the display signal is configured to display a keypad comprising a plurality of softkeys

Fukumoto discloses that the display signal is configured to display a keypad comprising a plurality of softkeys. This is shown, for example, in Figure 32, which is reproduced below, with accompanying description:

Ex & Ex at Figure 32.

FIG. 32 is a view of an example of the screen display of the PDA 10 according to a first example of present embodiment. As shown in the figure, the display screen of the PDA 10 shows a plurality of touch buttons "A" to "G". The touch panel 102 overlaid on the display screen detects a touch operation when a user touches a displayed touch button by his or her fingertip. Note that the letters assigned to the touch buttons are only given for identifying the touch buttons. Ex at 28; Ex at [0261].

In addition to the PDA embodiments, Fukumoto further discloses that the disclosed invention may be utilized as a touchscreen for many other types of devices, including ATMs, notebook and mobile computers, calculators, wristwatches, and other portable devices:

In the first embodiment to the 12th embodiments, the explanation was made of the case of application of the present invention to a PDA or an ATM. The present invention however of course may also be applied to for example a mobile phone, electronic notebook, mobile computer, wristwatch, electronic calculator, remote controller of an

electronic device, and other various types of portable electronic devices. Further, the present invention may also be applied to a stationary type computer or a vending machine, cash register, car navigation system, household electric appliance, or other of various types of electronic devices not having portability. Ex at 50; Ex at [0430] (emphasis added).

Further, the aspect of the invention according to the present embodiment can of course also be applied to a mobile phone serviced by a PDC (personal digital cellular) type mobile packet communication network or a PHS (personal handyphone system (registered trademark)) terminal. Ex at 36; Ex at [0325].

A person of ordinary skill in the art would understand that many of these devices inherently disclose a display signal that is configured to display a keypad comprising a plurality of softkeys.

As established above, among the devices that Fukumoto discloses can be used with his invention are mobile or cellular phones. See, e.g., Ex at [0430] and [0325]; Ex at 36 & 50. As would be understood by one of ordinary skill in the art, a cellular telephone implemented with a touch screen and without mechanical keys inherently discloses at least a numeric keypad for dialing phone numbers, entering responses to voice menus, and the like. To the extent that it is argued that this disclosure of soft keys comprising numeric key pads in the context of cellular phones is not inherent, it would have been obvious to a person of ordinary skill in the art to provide a numeric keypad on Fukumoto's touchscreen

when used with a cellular phone for dialing phone numbers.

As further established above, another device that Fukumoto discloses can be used with his invention is the ATM. See, e.g., Ex at 36; Ex at [0430]. As would be understood by one of ordinary skill in the art, an ATM implemented with a touch screen and without mechanical keys inherently discloses at least a numeric keypad to be able to function for its intended application. One of ordinary skill in the art would understand that displaying a numeric keypad for use on the disclosed ATM touchscreen would at least comprise displaying a plurality of keys comprising one softkey for each digit from 0 to 9 so that a user could provide numerical entries for things like PINs, requested dollar amounts, and other financial data.

As established above, another device that Fukumoto discloses can be used with his invention is the calculator. See, e.g., Ex at 36; Ex at [0430]. As would be understood by one of ordinary skill in the art, a calculator implemented with a touch screen and without mechanical keys inherently discloses at least a numeric keypad for arithmetic entries, or it would not be able to function for its intended purpose. To the extent that it is argued that this disclosure of soft keys comprising numeric key pads in the context of calculators is not inherent, it would have been obvious to a person of ordinary skill in the art to provide a

numeric keypad on Fukumoto's calculator touchscreen to permit number entry for arithmetic operations.

Thus, to the extent this reference does not expressly disclose this limitation, it inherently discloses it, because a keyboard with a plurality of softkeys is required on the disclosed touch panel PDAs for the disclosed functionality of inputting words, the disclosed ATMs for inputting PIN codes and dollar amounts, the disclosed cell phones for making phone calls, the disclosed calculators for entering numbers, etc.

Claim 17: The system of claim 15, wherein the plurality of softkeys comprises one softkey for each digit from 0 to 9.

303. Fukumoto inherently discloses a display signal configured to display a keypad comprising a plurality of softkeys comprising one softkey for each digit from 0 to 9, for the same reasons as set forth above with respect to claim 15, which analysis I incorporate here by reference.

Alternatively, this claim is rendered obvious by the disclosure of Fukumoto on its own and in combination with the knowledge of one of ordinary skill in the art, or by the disclosure of Fukumoto in view of Tsuji, for the same reasons as set forth above with respect to claim 15, which analysis I incorporate here by reference.

Claim 18: The system of claim 15, wherein the plurality of softkeys comprises the key configuration of a standard 101-key keyboard.

305. I am informed that U.S. Pat. No. 5,575,576 ("Roysden") issued on November 19, 1996 and, therefore, is prior art to the 356 patent under 35 U.S.C. 102(b).

Fukumoto expressly teaches to implement its disclosed touchscreen system in mobile computers and stationary computers. Ex. 1108, 50; Ex. 1109, [0430]. In implementing a touchscreen for such computers, a person of ordinary skill in the art would have had to select the configuration to use for the computer's soft keyboard. It would have been obvious to use the standard 101-key keyboard configuration introduced by IBM in 1986 because that was the standard for the PC industry for many years. Ex. 1113; Ex. (reproduced below).

Roysden teaches that it would have been desirable to use the standard 101-key IBM keyboard configuration for computers because users were familiar

with it. Ex. 1115, 5:7-26. In my opinion, it would have been obvious to configure the soft keyboard for the computer application taught by Fukumoto using the standard 101-key IBM keyboard configuration.

Claim 19: The system of claim 12, wherein the graphical object comprises a first graphical object and a second graphical object, the haptic effect comprises a first haptic effect and a second haptic effect, and wherein the first haptic effect is configured to be output when the object contacts the first graphical object, and the second haptic effect is configured to be output when the object contacts the second graphical object.

Fukumoto discloses the graphical object comprises a first graphical object and a second graphical object, the haptic effect comprises a first haptic effect and a second haptic effect, and wherein the first haptic effect is configured to be output when the object contacts the first graphical object, and the second haptic effect is configured to be output when the object contacts the second graphical object.

As I showed above with respect to Figure 2 and related disclosure, the piezoelectric oscillatory actuator 115 is configured to receive an actuator signal delivered via drive signal generation circuit 114. Fukumoto describes that this actuator signal is generated at least in part by an interaction, including, for example, the positional and pressure-level interactions.

As explained by Fukumoto, the haptic effect that is generated in response to a touch operation is different based on the graphical object contacted by the user, or based on other interaction with the graphical object, such as the

position of the knob or pressure applied. This is shown, for example, in Ex at Figs. 32, 33, 42-48; [0259]-[0293]; Ex at Figs. 1-5, 32-48, 13-18. These interactions are associated with haptic effects in a lookup table, as I will discuss further below. Id. The signals are received by the CPU, the CPU detects the interaction, and the CPU executing programmable code from memory generates the actuator signal based on the detected interaction and the waveform in the lookup table. Id.; Ex and Ex at Figs. 1-2, 5-6, 13-14, 17-18, and accompanying text.

For example, Fukumoto describes that different vibration modes are generated based upon which touch button was pressed:

As explained above, according to the first example of this embodiment, in the case of detecting a touch operation on the touch panel 102, the CPU 113 first detects a touched position and identifies the operated touch button. The CPU 113 then causes vibration to be generated from the oscillatory actuator 115 by a vibration mode linked with the type of the touch button.

Ex at 29; Ex at [0266] (emphasis added).

This is further explained with respect to Figure 33, shown below:

135 Ex & Fukumoto at Figure Figure 33 shows a first lookup tablee corresponding to a PDA screen of Figure 32, displaying Touch buttons labeled A, B, C... The lookup table of Figure 33 including entries for individual touch buttons, and their corresponding area or position on the screen: Next, FIG. 33 is a view illustrating a waveform data a table 112a stored in the memory 112 in the PDA 10. As shown in the figure, the waveform data table 112a stores, for each touch button displayed on the screen, area data showing the area occupied by a touch button on the touch panel 102 using XY coordinatess as well as waveform data of the drive signal to be applied to the oscillatory actuator 115 when that touch button is pressed. Ex at 31; Fukumoto at [0286], (emphasis added) Thus, as shown above, Fukumoto discloses, wherein the graphical object comprises a first graphical object and a second graphical object, the haptic WEST\ EXHIBIT PAGE 135

effect comprises a first haptic effect and a second haptic effect, and wherein the first haptic effect is configured to be output when the object contacts the first graphical object, and the second haptic effect is configured to be output when the object contacts the second graphical object.

As a further example, Figure 46 of Fukumoto shows a lookup table corresponding to a PDA screen of Figure 32, displaying Touch buttons labeled A, B, C... The lookup table of Figure 46 includes entries for individual touch buttons, their corresponding area or position on the screen, and pressure-level classifications, i.e. touch operation (lower contact pressure) and pressing operations (higher contact pressure). Figure 46 is reproduced below:

Ex & Ex at Figure 46.

Fukumoto explains that the different waveform data used to produce the different actuator signals are stored in this table, and are retrieved based in part

on the determined interactions, including both the positional and pressure-level interactions:

FIG. 46 is a view illustrating a waveform data table stored in a memory in a PDA according to a seventh embodiment of the present invention.

Ex at 11; Ex at [0103].

Further, the memory 112 of the PDA 10 according to the present embodiment stores the waveform data table 112d shown in FIG. 46. The waveform data table 112d corresponds to the example of the screen display of the touch buttons shown in FIG. 32. The waveform data table 112d stores for each touch button the area data for the touch button and the waveform data to be applied to the oscillatory actuator 115 for each of the cases when the touch button is subjected to a touch operation and when it is subjected to a pressing operation.

Ex at 31; Ex at [0288] (emphasis added).

In a PDA 10 having this configuration, when a touch operation is performed on the touch panel, the touch panel outputs to the CPU 113 a touch signal showing that a touch operation has been performed. The CPU 113 finds the coordinate data of the touched position based on the touch signal and identifies the touch button operated referring to the waveform data table 112d. Next, the CPU 113 reads the waveform data for the touch operation linked with the identified touch button from the waveform data table 112d. The CPU 113 then drives the oscillatory actuator 115 using the drive signal generated by the read waveform data. The same is true for the case when a pressing operation is performed on the touch panel. The CPU 113 reads the waveform data for the pressing operation linked with the operated touch button from the waveform data table 112d and drives the oscillatory actuator 115.

Ex at 31; Ex at [0289] (emphasis added).

Thus, as shown above, Fukumoto discloses, wherein the graphical object comprises a first graphical object and a second graphical object, the haptic

effect comprises a first haptic effect and a second haptic effect, and wherein the first haptic effect is configured to be output when the object contacts the first graphical object, and the second haptic effect is configured to be output when the object contacts the second graphical object.

As a final example of disclosure by Fukumoto of this claim element, Fukumoto explains how the PDA 10 transmits to the user vibration for the touch operation linked with the type of touch button contacted by the fingertip:

By adopting this configuration, when for example the user moves his or her fingertip in a state contacting but not pressing against the touch panel to find the position of a touch button, the PDA 10 transmits to the user vibration for the touch operation linked with the type of touch button contacted by the fingertip. That is, in the state where the user is searching the position of a touch button, for example, a weak vibration that differs for each type of touch button is transmitted to the user. On the other hand, when the user finds the desired touch button and presses against the touch button, the PDA 10 transmits to the user vibration for the pressing operation linked with the type of the touch button. That is, when the user presses against a touch button, vibration showing that the operation has been received is given to the user.

Ex at 31-32; Ex at [0290] (emphasis added).

Fukumoto further explains that a range of different sets of waveforms are available and stored in memory for providing these differentiated haptic effects: "In the first embodiment, the memory 112 stores a plurality of types of waveform data." Ex at 46; Ex at [0405].
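The table-driven retrieval described above can be sketched in code: area data identifies the operated button, a pressure threshold distinguishes a touch from a press, and the table links each (button, operation) pair to its own waveform. This is a minimal illustrative sketch only; the names, coordinates, waveform labels, and threshold value are my own hypothetical choices, not Fukumoto's.

```python
# Illustrative sketch of the waveform data table of FIG. 46: for each touch
# button, area data (x_min, y_min, x_max, y_max) plus one waveform for a
# light "touch" operation and another for a harder "pressing" operation.
# All names and values here are hypothetical.

PRESS_THRESHOLD = 0.5  # hypothetical pressure level separating touch from press

WAVEFORM_TABLE = {
    "A": {"area": (0, 0, 40, 20), "touch": "weak_sine", "press": "strong_sine"},
    "B": {"area": (50, 0, 90, 20), "touch": "weak_square", "press": "strong_square"},
}

def lookup_waveform(x, y, pressure):
    """Identify the operated button from the contact position, classify the
    operation by pressure level, and return the linked waveform (or None)."""
    for button, entry in WAVEFORM_TABLE.items():
        x0, y0, x1, y1 = entry["area"]
        if x0 <= x <= x1 and y0 <= y <= y1:
            op = "press" if pressure >= PRESS_THRESHOLD else "touch"
            return button, op, entry[op]
    return None  # contact falls outside every button's area data

# Example: a light contact inside button A returns its "touch" waveform,
# while a firm contact inside button B returns its "press" waveform.
print(lookup_waveform(10, 10, 0.2))  # ('A', 'touch', 'weak_sine')
print(lookup_waveform(60, 5, 0.9))   # ('B', 'press', 'strong_square')
```

In this sketch, as in the disclosure above, the same contact position yields different waveforms depending only on the applied pressure, and different buttons yield different waveforms for the same operation.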

320. As further evidence supporting my opinion that Fukumoto discloses this limitation, see also: Ex at [0169], [0179], [0199], [0232], [ ], [ ], [ ], [ ], [0291], [0329], [ ], [ ], [0348], [0353], [ ], [ ], [0370]; Figs. 6, 13, 17, 33, 38-40, 41, 46-47, 52, 54, 62-63. See also Fukumoto PCT at 10, 11, 13-16, 18, 19, 23-25, 27-30, 31 (e.g., "Herein, the waveform data table 112d illustrated in Fig. 46 is stored in the memory 112 of the PDA 10 according to this embodiment. This waveform data table 112d corresponds to a screen display example of a touch button illustrated in Fig. 32. The area data of each touch button, and the waveform data that is applied to the vibration actuator 115 when the touch button is touch operated and when the touch button is pressing operated is stored in the waveform data table 112d."), 33, 34, 39, 46; Figs. 2, 6, 13, 17, 32, 33, 38-40, 41, 46-47, 52, 54, 62-63.

Thus, as shown by the numerous descriptions above, Fukumoto discloses, wherein the graphical object comprises a first graphical object and a second graphical object, the haptic effect comprises a first haptic effect and a second haptic effect, and wherein the first haptic effect is configured to be output when the object contacts the first graphical object, and the second haptic effect is configured to be output when the object contacts the second graphical object.

Claim 20: The system of claim 12, wherein the haptic effect data comprises a

plurality of haptic effects.

Fukumoto discloses the haptic effect data comprises a plurality of haptic effects.

As I showed above with respect to Figure 2 and related disclosure, the piezoelectric oscillatory actuator 115 is configured to receive an actuator signal delivered via drive signal generation circuit 114. Fukumoto describes that this actuator signal is generated at least in part by an interaction, including, for example, the positional, pressure-level and key-level interactions.

As explained by Fukumoto, the haptic effect that is generated in response to a touch operation is different based on the graphical object contacted by the user, or based on other interaction with the graphical object, such as the position of the knob, pressure applied or function associated with the graphical object. Thus, the haptic effect data in the lookup table comprises a plurality of different haptic effects.

Fukumoto expressly confirms that the haptic effect data in the lookup table comprises a plurality of different haptic effects: "In the first embodiment, the memory 112 stores a plurality of types of waveform data." Ex at 46; Ex at [0405].

This is shown, for example, in Ex at Figs. 32, 33, 42-48; [0259]-[0293]; Ex at Figs. 1-5, 32-48, 13-18. These interactions are

associated with haptic effects in a lookup table, as I will discuss further below. Id. The signals are received by the CPU, the CPU detects the interaction, and the CPU executing programmable code from memory generates the actuator signal based on the detected interaction and the waveform in the lookup table. Id.; Ex and Ex at Figs. 1-2, 5-6, 13-14, 17-18, and accompanying text.

For example, Fukumoto describes that different vibration modes are generated based upon which touch button was pressed:

As explained above, according to the first example of this embodiment, in the case of detecting a touch operation on the touch panel 102, the CPU 113 first detects a touched position and identifies the operated touch button. The CPU 113 then causes vibration to be generated from the oscillatory actuator 115 by a vibration mode linked with the type of the touch button.

Ex at 29; Ex at [0266] (emphasis added).

This is further explained with respect to Figure 33, shown below:

Ex & Ex at Figure 33.

Figure 33 shows a first lookup table corresponding to a PDA screen of Figure 32, displaying Touch buttons labeled A, B, C... The lookup table of Figure 33 includes entries for individual touch buttons, and their corresponding area or position on the screen:

Next, FIG. 33 is a view illustrating a waveform data table 112a stored in the memory 112 in the PDA 10. As shown in the figure, the waveform data table 112a stores, for each touch button displayed on the screen, area data showing the area occupied by a touch button on the touch panel 102 using XY coordinates as well as waveform data of the drive signal to be applied to the oscillatory actuator 115 when that touch button is pressed.

Ex at 31; Fukumoto at [0286] (emphasis added).

Thus, as shown above, Fukumoto discloses that the haptic effect data in the lookup table comprises a plurality of different haptic effects.

As a further example, Figure 46 of Fukumoto shows a lookup table corresponding to a PDA screen of Figure 32, displaying Touch buttons labeled A, B, C... The lookup table of Figure 46 includes entries for individual touch buttons, their corresponding area or position on the screen, and pressure-level classifications, i.e. touch operation (lower contact pressure) and pressing operations (higher contact pressure). Figure 46 is reproduced below:

Ex & Ex at Figure 46.

Fukumoto explains that the different waveform data used to produce the different actuator signals are stored in this table, and are retrieved based in part on the determined interactions, including both the positional and pressure-level interactions:

FIG. 46 is a view illustrating a waveform data table stored in a memory in a PDA according to a seventh embodiment of the present invention.

Ex at 11; Ex at [0103].

Further, the memory 112 of the PDA 10 according to the present embodiment stores the waveform data table 112d shown in FIG. 46. The waveform data table 112d corresponds to the example of the screen display of the touch buttons shown in FIG. 32. The waveform data table 112d stores for each touch button the area data for the touch button and the waveform data to be applied to the oscillatory actuator 115 for each of the cases when the touch button is subjected to a touch operation and when it is subjected to a pressing operation.

Ex at 31; Ex at [0288] (emphasis added).

In a PDA 10 having this configuration, when a touch operation is performed on the touch panel, the touch panel outputs to the CPU 113 a touch signal showing that a touch operation has been performed. The CPU 113 finds the coordinate data of the touched position based on the touch signal and identifies the touch button operated referring to the waveform data table 112d. Next, the CPU 113 reads the waveform data for the touch operation linked with the identified touch button from the waveform data table 112d. The CPU 113 then drives the oscillatory actuator 115 using the drive signal generated by the read waveform data. The same is true for the case when a pressing operation is performed on the touch panel. The CPU 113 reads the waveform data for the pressing operation linked with the operated touch button from the waveform data table 112d and drives the oscillatory actuator 115.

Ex at 31; Ex at [0289] (emphasis added).

Thus, as shown above, Fukumoto discloses that the haptic effect data in the lookup table comprises a plurality of different haptic effects.

As a final example of disclosure by Fukumoto of this claim element, Fukumoto explains how the PDA 10 transmits to the user vibration for the touch operation linked with the type of touch button contacted by the fingertip:

By adopting this configuration, when for example the user moves his or her fingertip in a state contacting but not pressing against the touch panel to find the position of a touch button, the PDA 10 transmits to the user vibration for the touch operation linked with the type of touch button contacted by the fingertip. That is, in the state where the user is searching the position of a touch button, for example, a weak vibration that differs for each type of touch button is transmitted to the user.
On the other hand, when the user finds the desired touch button and presses against the touch button, the PDA 10 transmits to the user vibration for the pressing operation linked with the type of the touch button. That is, when the user presses against a touch button,

vibration showing that the operation has been received is given to the user.

Ex at 31-32; Ex at [0290] (emphasis added).

As is evident from the disclosures above, the lookup tables in Fukumoto include a plurality of haptic effects, and thus Fukumoto discloses that the haptic effect data in the lookup table comprises a plurality of different haptic effects.

In addition, I note that the specific haptic effect data on which a single haptic effect is produced may comprise a plurality of haptic effects. For example, Fukumoto discloses that the CPU can generate an actuator signal by synthesizing a signal from two waveforms in the table (Ex at 30; Ex at [0281]) (e.g. a plurality of haptic effects) or based on the input parameter (Ex at 30-31; Ex at [0282]-[0284]).

As further evidence supporting my opinion that Fukumoto discloses this limitation, see also: Ex at 16; Ex at [0165] (Note that the frequency f0 of the drive signal may also be set so that a frequency which is an integral multiple of the frequency f0 corresponds to the natural frequency f1 or natural frequency f2. It is possible to cause the main case 101 or the oscillatory actuator 115 of the PDA 10 to resonate even with such a frequency f0. Further, it should be understood that the waveform of the drive signal is not limited to the SIN wave illustrated in FIG. 4 but may also be a square wave, trapezoidal wave,

triangular wave, and the like.); Ex at 29; Ex at [0273] ("For example, when a drag operation is under way, it is preferable to continuously give a weak vibration."); Ex at 31-32; Ex at [0290] ("By adopting this configuration, when for example the user moves his or her fingertip in a state contacting but not pressing against the touch panel to find the position of a touch button, the PDA 10 transmits to the user vibration for the touch operation linked with the type of touch button contacted by the fingertip. That is, in the state where the user is searching the position of a touch button, for example, a weak vibration that differs for each type of touch button is transmitted to the user. On the other hand, when the user finds the desired touch button and presses against the touch button, the PDA 10 transmits to the user vibration for the pressing operation linked with the type of the touch button. That is, when the user presses against a touch button, vibration showing that the operation has been received is given to the user."); Ex at 46; Ex at [0405] ("In the first embodiment, the memory 112 stores a plurality of types of waveform data."); Figs.

As further evidence supporting my opinion that Fukumoto discloses this limitation, see also Fukumoto WO at 22 ("In other words, the form of the vibration generated by the vibration actuator can be different depending on the execution results of the process."), 28 (The present embodiment describes an electronic apparatus that notifies the user that an operation input has been received,

using vibration with different forms based on the type of operation input).

Claim 21: The system of claim 12, wherein the lookup table comprises one or more of input device data, position data, pressure data, or function data.

Fukumoto discloses the lookup table comprises one or more of input device data, position data, pressure data, or function data.

As I have established above, Fukumoto discloses a processor configured to determine an interaction between the object contacting the touch-sensitive surface and the graphical object. In particular, as part of my analyses directed to claim element 12f, which I incorporate here by reference, I showed that Fukumoto discloses determining at least a positional, pressure-level, and function interaction between the object contacting the touch-sensitive surface and the graphical object.

As I described above, the positional determination of interaction is performed by the CPU 113, which compares the positional coordinates of the point of contact derived from the touch signal output by touch panel 102 to that of the area occupied by the graphical object. Fukumoto explains that this detection of positional correspondence is indicated to the user by one or more haptic effects which are generated in response to this detection.

As I further described above, the pressure-level determination of interaction is also performed by the CPU 113, which determines the pressure being applied to the surface at the location of the graphical object, and compares this to a

predetermined pressure threshold. Fukumoto explains that the touch signal generated by the touch panel 102 indicates not only the position of contact, but also indicates the level of pressure being applied. The level of pressure is then compared to the predetermined pressure threshold to distinguish between a touch operation (a lower pressure contact) and that of a pressing operation (a higher pressure contact). Fukumoto explains that the detection of these different levels of pressure is indicated to the user by causing the device to respond with distinct haptic effects.

As I further described above, the movement and function determination of interaction is also performed by the CPU 113, which associates contact with displayed softkeys comprising graphical objects with functions such as select, delete, drag, etc., and thereby determines the function or motion interactions of the graphical object.

As I established above with respect to Figure 2 and related disclosure, the piezoelectric oscillatory actuator 115 is configured to receive an actuator signal delivered via drive signal generation circuit 114. Fukumoto describes that this actuator signal is generated at least in part by an interaction, including, for example, the positional and pressure-level interactions.

As explained by Fukumoto, the haptic effect that is generated in response to a touch operation is different based on the graphical object contacted by the user, or based on other interaction with the graphical object, such as the

position of the knob or pressure applied. This is shown, for example, in Ex at Figs. 32, 33, 42-48; [0259]-[0293]; Ex at Figs. 1-5, 32-48, 13-18. These interactions are associated with haptic effects in a lookup table, as I will discuss further below. Id. The signals are received by the CPU, the CPU detects the interaction, and the CPU executing programmable code from memory generates the actuator signal based on the detected interaction and the waveform in the lookup table. Id.; Ex and Ex at Figs. 1-2, 5-6, 13-14, 17-18, and accompanying text.

As a first example of a lookup table, Figure 33, shown below, comprises at least position data, and thus discloses this limitation:

Ex & Ex at Figure 33.

Figure 33 shows a first lookup table corresponding to a PDA screen of

Figure 32, displaying Touch buttons labeled A, B, C... The lookup table of Figure 33 includes entries for individual touch buttons, and their corresponding area or position on the screen:

Next, FIG. 33 is a view illustrating a waveform data table 112a stored in the memory 112 in the PDA 10. As shown in the figure, the waveform data table 112a stores, for each touch button displayed on the screen, area data showing the area occupied by a touch button on the touch panel 102 using XY coordinates as well as waveform data of the drive signal to be applied to the oscillatory actuator 115 when that touch button is pressed.

Ex at 31; Fukumoto at [0286] (emphasis added).

Thus, as shown above, Fukumoto discloses a lookup table in Figure 33 that comprises at least position data, and thus discloses this limitation.

As a further example, Figure 46 of Fukumoto shows a lookup table corresponding to a PDA screen of Figure 32, displaying Touch buttons labeled A, B, C... The lookup table of Figure 46 includes entries for individual touch buttons, their corresponding area or position on the screen, and pressure-level classifications, i.e. touch operation (lower contact pressure) and pressing operations (higher contact pressure). Figure 46 is reproduced below:

Ex & Ex at Figure 46.

Fukumoto explains that the different waveform data used to produce the different actuator signals are stored in this table, and are retrieved based in part on the determined interactions corresponding to entries in this table, including both the positional and pressure-level interactions:

FIG. 46 is a view illustrating a waveform data table stored in a memory in a PDA according to a seventh embodiment of the present invention.

Ex at 11; Ex at [0103].

Further, the memory 112 of the PDA 10 according to the present embodiment stores the waveform data table 112d shown in FIG. 46. The waveform data table 112d corresponds to the example of the screen display of the touch buttons shown in FIG. 32. The waveform data table 112d stores for each touch button the area data for the touch button and the waveform data to be applied to the oscillatory actuator 115 for each of the cases when the touch button is subjected to a touch operation and when it is subjected to a pressing operation.

Ex. at 31; Ex. at [0288] (emphasis added).

In a PDA 10 having this configuration, when a touch operation is performed on the touch panel, the touch panel outputs to the CPU 113 a touch signal showing that a touch operation has been performed. The CPU 113 finds the coordinate data of the touched position based on the touch signal and identifies the touch button operated referring to the waveform data table 112d. Next, the CPU 113 reads the waveform data for the touch operation linked with the identified touch button from the waveform data table 112d. The CPU 113 then drives the oscillatory actuator 115 using the drive signal generated by the read waveform data. The same is true for the case when a pressing operation is performed on the touch panel. The CPU 113 reads the waveform data for the pressing operation linked with the operated touch button from the waveform data table 112d and drives the oscillatory actuator 115.

Ex. at 31; Ex. at [0289] (emphasis added).

352. Thus, as shown above, Fukumoto discloses a lookup table in Figure 46 that comprises at least position data and pressure data, and thus discloses this limitation.

353. As a final example, Figures 38-40 and accompanying text disclose a lookup table containing function data (operation instruction), such as select icon, drag, and delete data. This aspect of Fukumoto's system is described with respect to graphical icons being displayed on the PDA of Fukumoto and shown in Figures 38-39, which depict folder and trash icons, familiar to most users of computing devices. These figures are reproduced below:

354. As explained by Fukumoto, these figures depict a drag-and-drop operation, as would be used when deleting the folder by dropping it in the trash bin:

FIG. 38 and FIG. 39 are views illustrating the state where a user is dragging an icon displayed on the display screen of the PDA 10 by a touch operation on the touch panel 102 to transfer it to the trash. Note that the trash spoken of here is an icon for instructing deletion of data.

Ex. at 29; Ex. at [0270] (emphasis added).

355. Fukumoto describes operation instructions such as select, drag, and delete data associated with this action:

First, when a user selects the icon desired to be dragged by the touch operation on the touch panel 102, the CPU 113 of the PDA 10 detects the touched position and identifies the touch operation as an instruction for the selection of the icon. The memory 112 of the PDA 10 stores the waveform data table 112b, as shown in FIG. 40, storing the waveform data of the drive signal to be applied to the oscillatory

actuator 115 for each type of instruction designated by an operation input.

Ex. at 29; Ex. at [0271].

356. The lookup table referenced above with respect to this operation is shown in Figure 40 of Fukumoto, which is reproduced below:

Ex. & Ex. at Figure 40.

357. As can be observed in this figure, a variety of different functional instructions corresponding to different actions are stored in the lookup table, including Enter, Cancel, Click, Drag, Select Icon, etc. I note that these correspond to the function data that are disclosed and claimed in the '356 patent as being in its lookup table.
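The table-driven retrieval that Fukumoto describes above can be summarized in a short sketch. This is my own illustrative model, not code from Fukumoto or the '356 patent; every identifier, coordinate, and waveform name below is hypothetical.

```python
# Illustrative model (mine, not Fukumoto's) of the waveform data tables of
# Figures 33/46 (keyed by button area and pressure level) and Figure 40
# (keyed by instruction type). All values are hypothetical.

# Figure 46-style table: button -> screen area and per-operation waveforms.
BUTTON_TABLE = {
    "A": {"area": (0, 0, 100, 50),    # (x1, y1, x2, y2) on the touch panel
          "touch": "waveform_A_touch", "press": "waveform_A_press"},
    "B": {"area": (0, 60, 100, 110),
          "touch": "waveform_B_touch", "press": "waveform_B_press"},
}

# Figure 40-style table: instruction type -> waveform data.
INSTRUCTION_TABLE = {
    "enter": "wf_enter", "cancel": "wf_cancel", "click": "wf_click",
    "drag": "wf_drag", "select_icon": "wf_select_icon",
}

def button_waveform(x, y, operation):
    """Identify the button whose area contains (x, y) and return its
    waveform for the given operation ("touch" or "press")."""
    for entry in BUTTON_TABLE.values():
        x1, y1, x2, y2 = entry["area"]
        if x1 <= x <= x2 and y1 <= y <= y2:
            return entry[operation]
    return None   # contact outside every button area

def instruction_waveform(instruction):
    """Return the waveform stored for an instruction type."""
    return INSTRUCTION_TABLE.get(instruction)

# A pressing operation inside button A's area, and a drag instruction:
print(button_waveform(50, 25, "press"))    # waveform_A_press
print(instruction_waveform("drag"))        # wf_drag
```

The sketch reflects only the lookup structure described in the passages above: area data identifies the operated button from the touched coordinates, and the operation type (touch, press, or a designated instruction) selects which stored waveform drives the actuator.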

358. Thus, as shown above, Fukumoto discloses a lookup table in Figure 40 that comprises at least function data, and thus discloses this limitation.

Limitation 1.pre: A method, comprising:

359. Fukumoto discloses or renders obvious this preamble for the reasons I provided above with respect to the preamble of claim 12 and claim element [12c], which discussions I incorporate here by reference.

Limitation 1.a: outputting a display signal configured to display a graphical object on a touch-sensitive input device;

360. Fukumoto discloses or renders obvious this limitation for the reasons I provided above with respect to claim element [12d], which discussion I incorporate by reference here.

Limitation 1.b: receiving a sensor signal from the touch-sensitive input device, the sensor signal indicating an object contacting the touch-sensitive input device;

361. Fukumoto discloses or renders obvious this limitation for the reasons I provided above with respect to claim elements [12a] and [12e], which discussions I incorporate by reference here.

Limitation 1.c: determining an interaction between the object contacting the touch-sensitive input device and the graphical object;

362. Fukumoto discloses or renders obvious this limitation for the reasons I provided above with respect to claim element [12f], which discussion I incorporate by reference here.

Limitation 1.d: generating an actuator signal based at least in part on the interaction and haptic effect data in a lookup table.

363. Fukumoto discloses or renders obvious this limitation for the reasons I provided above with respect to claim element [12g], which discussion I incorporate by reference here.

Claim 2: The method of claim 1, wherein the actuator signal is configured to cause a haptic effect to be output.

364. Fukumoto discloses or renders obvious this limitation for the reasons I provided above with respect to claim elements [12b], [12g] and [12h], which discussions I incorporate by reference here.

Claim 3: The method of claim 1, wherein the actuator signal is generated when the object contacts the touch-sensitive device at a location corresponding to the graphical object.

365. Fukumoto discloses or renders obvious this limitation for the reasons I provided above with respect to claim 13, which discussion I incorporate by reference here.

Claim 5: The method of claim 1, wherein the display signal is configured to display a keypad comprising a plurality of softkeys.

366. Fukumoto discloses or renders obvious this limitation for the reasons I provided above with respect to claim 15, which discussion I incorporate by reference here.

Claim 7: The method of claim 5, wherein the plurality of softkeys comprises one softkey for each digit from 0 to 9.

367. Fukumoto discloses or renders obvious this limitation for the reasons I provided above with respect to claim 17, which discussion I incorporate by reference here.

Claim 8: The method of claim 5, wherein the plurality of softkeys comprises the key configuration of a standard 101-key keyboard.

368. Fukumoto in view of Roysden renders obvious this limitation for the reasons I provided above with respect to claim 18, which discussion I incorporate by reference here.

Claim 9: The method of claim 1, wherein the graphical object comprises a first graphical object and a second graphical object, the haptic effect comprises a first haptic effect and a second haptic effect, and wherein the first haptic effect is configured to be output when the object contacts the first graphical object, and the second haptic effect is configured to be output when the object contacts the second graphical object.

369. Fukumoto discloses or renders obvious this limitation for the reasons I provided above with respect to claim 19, which discussion I incorporate by reference here.

Claim 10: The method of claim 1, wherein the haptic effect data comprises a plurality of haptic effects.

370. Fukumoto discloses or renders obvious this limitation for the reasons I provided above with respect to claim 20, which discussion I incorporate by reference here.

Claim 11: The method of claim 1, wherein the haptic effect data comprises a plurality of haptic effects.

371. Fukumoto discloses or renders obvious this limitation for the reasons I provided above with respect to claim 21, which discussion I incorporate by reference here.

Limitation 22.pre: A computer-readable medium comprising program code, comprising:

372. Fukumoto discloses or renders obvious this preamble for the reasons I provided above with respect to the preamble of claim 12 and claim element [12c], which discussions I incorporate here by reference. In particular, I note that Fukumoto discloses to a person of ordinary skill that a CPU 113 runs software from memory 112 to perform the claimed functions. Fukumoto at Fig. 2 and accompanying text. That software would comprise the program code for the limitations of this claim.

Limitation 22.a: program code for outputting a display signal configured to display a graphical object on a touch-sensitive input device;

373. Fukumoto discloses or renders obvious this limitation for the reasons I provided above with respect to claim element [12d], which discussion I incorporate by reference here.

Limitation 22.b: program code for receiving a sensor signal from the touch-sensitive input device, the sensor signal indicating an object contacting the touch-sensitive input device;

374. Fukumoto discloses or renders obvious this limitation for the reasons I provided above with respect to claim elements [12a] and [12e], which discussions I

incorporate by reference here.

Limitation 22.c: program code for determining an interaction between the object contacting the touch-sensitive input device and the graphical object;

375. Fukumoto discloses or renders obvious this limitation for the reasons I provided above with respect to claim element [12f], which discussion I incorporate by reference here.

Limitation 22.d: program code for generating an actuator signal based at least in part on the interaction and haptic effect data in a lookup table, the actuator signal configured to cause a haptic effect to be output.

376. Fukumoto discloses or renders obvious this limitation for the reasons I provided above with respect to claim element [12g], which discussion I incorporate by reference here.

Claim 23: The computer-readable medium of claim 22, wherein the actuator signal is generated when the object contacts the touch-sensitive device at a location corresponding to the graphical object.

377. Fukumoto discloses or renders obvious this limitation for the reasons I provided above with respect to claim 13, which discussion I incorporate by reference here.

Claim 25: The computer-readable medium of claim 22, wherein the haptic effect data comprises a plurality of haptic effects.

378. Fukumoto discloses or renders obvious this limitation for the reasons I provided above with respect to claim 20, which discussion I incorporate by reference here.

Claim 26: The computer-readable medium of claim 22, wherein the lookup table comprises one or more of input device data, position data, pressure data, or function data.

379. Fukumoto discloses or renders obvious this limitation for the reasons I provided above with respect to claim 21, which discussion I incorporate by reference here.

B. Tsuji

380. Japanese Patent Application Publication No. , titled "Information Display Device and Operation Input Device," to Tsuji et al. was published on August 6, . I understand from Apple counsel that Tsuji is 35 U.S.C. § 102(b) prior art because it was published before the earliest alleged priority date on the face of the '356 patent.

381. Tsuji is directed toward information displays and input devices, such as touch panels, for factory automation equipment, automatic vending machines, automatic teller machines, home appliances, and the like. Ex. at [0001]. Tsuji explains that one of the issues concerning the use of touch panels in devices like those discussed above is that a touch panel can leave a user unsure of whether the user's input was received. Id. at [0003]. Visual and audible feedback, such as changing display colors or producing an audible sound, have been used to address this issue, but there are problems associated with such techniques. Id. at [0003]-[0004]. For example, visual feedback is sometimes obscured by a finger pressing

on the display, or is hard to perceive by users with poor eyesight; audible feedback can be obscured in noisy environments or in environments with multiple devices (e.g., multiple vending machines) being used at the same time, and can also be hard to perceive by users with poor hearing. Id. at [0005]-[0006].

382. In order to address these and other problems, Tsuji discloses a system with a touch panel in which feedback, in the form of vibrations caused by piezoelectric vibrators or piezoelectric elements, is provided to the user. Ex. at [0014]. In some embodiments, an information display system includes a touch panel and a control circuit, which Tsuji discloses can be implemented by hardware or by software using a microcomputer. Id. at [0081]. The control circuit includes a position computing unit and an operating force detecting unit, which detects a total force based on forces reported by individual piezoelectric elements. Id. at [0082]-[0085]. The control circuit also includes an operating force determining unit, which inputs the total force and four thresholds, and decides into which of five classifications (i.e., ranges) F0, F1, F2, F3, F4 the total force belongs based on the four thresholds. Id. at [0092]-[0097].

383. When the total force belongs to ranges F1-F4, the control circuit selects a drive mode for vibrating the touch panel according to the matrix of Fig. 13 reproduced below, in which R1-R6 represent regions of the touchscreen, such as buttons, and R0 represents no defined region (Ex. at

[0087]-[0091]):

Ex. at Fig. 13; Ex. at [0107]-[0108].

384. The symbols S11, S12 in the matrix select and designate any of the various drive modes, such as those shown in Fig. 14, which vary by amplitude, frequency, and length of time (other modes are expressly contemplated):

Ex. at Fig. 14.

385. In the section below, I will present my detailed invalidity analysis with respect to Tsuji, beginning with claim 12 of the '356 patent.

Limitation 12.pre: a system, comprising:

386. To the extent that the preamble is limiting, Tsuji discloses and/or renders obvious a system. In general terms, essentially any device comprising interconnected or otherwise intercommunicating functional elements or blocks

would represent a system, and Tsuji discloses various embodiments of such systems.

387. More particularly with respect to the devices in Tsuji, Tsuji discloses a range of systems, each comprising an information display device as one of its components:

To provide an information display device with a small number of operating surface and display surface peripheral parts providing a reliable sense of operation and enabling a tracing operation without a push stroke.

Ex. at Abstract (emphasis added).

388. Tsuji further discloses that the information display device is combined with an input device, and may be utilized as part of a variety of different systems, including handheld terminals, ATMs, home appliances, and others:

The present invention relates to an information display device and an operation input device used in, for example, factory automation (FA) devices, automatic vending machines, automatic ticket vending machines, automatic teller machines, home appliances, medical operating equipment, information equipment, handheld terminals, game devices, and the like.

Ex. at [0001].

389. The systems disclosed in Tsuji combine the information display and input devices comprising a touch panel to provide touchscreen functionality, and Tsuji describes the advantages of such a combination:

Devices where a touch panel is arranged on a display are in wide use as one type of information display device having an operation input

function. Touch panels are extremely thin, and have the advantage of providing a high degree of freedom for selecting an area that can be used as a switch.

Ex. at [0002].

390. One of the embodiments of the system disclosed in Tsuji comprises an Automatic Teller Machine (ATM) having a touchscreen, as shown in Figure 1. I will refer to this as the ATM embodiment. This figure, along with the accompanying description of the input and display functionality of its touchscreen, is reproduced below:

Ex. at Figure 1.

FIG. 1 is a drawing illustrating an example of a system incorporating an information display device 100 of a first embodiment according to the present invention.

Ex. at [0040] (emphasis added).

FIG. 1 is a perspective view of an automatic teller machine (ATM) 1 as an example of a system incorporating an information display device 100 of a first embodiment of the present invention. The automatic teller machine 1 is provided with a cashier section 3 and a card and bank passbook insertion section 4 on a front surface of a chassis 2. The machine is also provided with an information input and output section 5, and the information display device 100 is used in the information input and output section 5.

Ex. at [0041] (emphasis added).

391. Another embodiment of the system disclosed in Tsuji comprises a hand-held information display device having a touchscreen, as shown in Figures 17 and 18. I will refer to this embodiment as the PDA embodiment. These figures, along with the accompanying description of the touchscreen, are reproduced below:

Ex. at Figures 17 & 18.

FIG. 17 is a perspective drawing of the exterior of an information display device 200 according to a third embodiment of the present invention.

Ex. at [0040].

The operating regions R1 through R4 displayed by the liquid crystal display panel are visible through the operating surface 11 in FIG. 17. Typically, these operating regions R1 through R4 are displayed along both sides of the operating surface. An operator grips the housing 201 by both sides, as illustrated by the broken lines in FIG. 17, and performs operations by pressing these operating regions R1 through R4 with his/her thumbs. When the position of this pressing operation is sensed, if the pressing force thereof is larger than a predetermined threshold, the operation input is accepted, a displayed object 210 (FIG. 18) on a screen changes, and the operating surface 11 is vibrated or slightly displaced based on a predetermined mode. The operation at this point is the same as in the first and second embodiments.

Ex. at [0146] (emphasis added).

392. In addition, the touchscreen device that Tsuji discloses further includes mechanical actuators to provide haptic feedback in the form of vibrations, and other mechanical displacements of the screen, in response to user contact made during an input operation:

An operating panel 10 is arranged on a liquid crystal display panel 20, and the operating panel 10 is supported by piezoelectric elements E1 through E4. Pressing an operating surface 11 of the operating panel 10 with a finger generates voltage at both ends of the piezoelectric elements E1 through E4, and an operating force and an operating position are sensed by detecting and calculating said voltage. High frequency is applied to the piezoelectric elements E1 through E4 when an operating force larger than a predetermined threshold is sensed, which thus causes the operating surface 11 to vibrate. This vibration allows an operator to obtain a reliable sense of operation.
The number of parts is small because the sensing of the operating force applied to the operating surface and the application of the vibration to the operating surface 11 are performed using the common piezoelectric elements E1 through E4. Furthermore, a tracing operation is made possible because an operating force smaller than the predetermined threshold causes no reaction.

Ex. at Abstract (emphasis added).

In response to the first object described above, the present invention uses a mechanical reaction such as a vibration or a small displacement, and the like, of an operating surface as a response to an operation input from a device side. For example, the operating surface can be vibrated using a piezoelectric element (that is, a piezoelectric vibrator or a piezo element) to thus give an operator a reliable sense of operation.

Ex. at [0014] (emphasis added).

393. Tsuji also discloses that the force applied to the touchscreen during contact can be classified and subjected to a threshold determination, thereby allowing different force thresholds to be applied with respect to the input device's haptic feedback operation. A functional block diagram of the circuitry used for processing signals to determine the region of contact, as well as the force classification, is shown in Figure 7, which is reproduced below along with accompanying description:

Ex. at Figure 7.

By the way, as is illustrated in FIG. 7, the region determining signal SR from the region determining unit 52 is also input to the operating force classification storage unit 55. As has already been mentioned, this is to enable changes to the values of the thresholds Fh1 through Fh4 based on the in-operation region R. Specifically, a plurality of combinations of the thresholds Fh1 through Fh4 are input to and stored in the operating force classification storage unit 55 from the information processing unit 60 based on the screen being displayed at a given time, and the threshold of one of these combinations is selected based on a region classifying signal R. Therefore, if the threshold for the operating force F is changed for every in-operation region R (or for the operating position at a given time) in this way, the operating force determination in the operating force determining unit 54 can be performed after the region determining signal SR is generated from the region determining unit 52. This can be achieved by, for example, delaying the timing of the operation of the comparing and determining unit 54a in FIG. 12 a very short period of time with respect to the operation timing of the region determining unit 52, or by inserting a delay circuit in front of the comparing and determining unit 54a.

Ex. at [0103].

394. Tsuji also discloses that piezoelectric films may be used, in addition to the piezoelectric actuators described above, for sensing operations in the disclosed systems. This is described, for example, with respect to Figure 23, which is reproduced below with accompanying description:

Ex. at Figure 23 (touch panel 10A [sic]).

Ceramic piezoelectric elements and piezoelectric films, and the like, may also be used in cases where piezoelectric elements are used. FIG. 23 is a partial view illustrating an example using a piezoelectric film 310. In this example, a piezoelectric film 310 is arranged under the operating panel 10 or the touch panel 10T near each of the four corners thereof, and these piezoelectric films 310 are each supported by an elastic body 311 such as a spring, a piece of rubber, or the like. Screen display is performed using a liquid crystal display panel (not illustrated) arranged under the operating panel 10 or the touch panel 10T, just as in the embodiments described above. When an operator presses a desired location on the operating panel 10 or the touch panel

10T, the elastic bodies 311 contract and voltage is generated on both sides of the piezoelectric film corresponding to a related pressing force and pressing location, and then a pressing force and a pressing location are detected through the detection of this voltage.

Ex. at [0161].

395. It is important to note that each of the disclosed embodiments, while directed to various applications, is expressly taught to be a variation of the same basic embodiment. For example, Figures 1-14 and their accompanying text are expressly labelled a first embodiment, wherein a small touchscreen panel is included within a larger ATM machine. A second embodiment includes Figures 15-16 and accompanying text as a replacement for the touch panel of Figure 3, and Tsuji discloses everything else is the same as the first embodiment with only the specified differences. See, e.g., [0133], [0137]-[0138], [0143]-[0144]. Similarly, a third embodiment comprising a PDA-type handheld device includes Figures 17-18 and accompanying text, and Tsuji expressly discloses that everything else is the same as the first and second embodiments with only the specified differences. ("The operation at this point is the same as in the first and second embodiments." Ex. at [0146].)

396. Another embodiment of the system disclosed in Tsuji includes additional figures and accompanying text, and Tsuji again expressly discloses that everything else is the same as the first and second embodiments with only the specified differences. Ex. at [0159]. I will refer to this embodiment as the Volume

Controller embodiment. Tsuji discloses two variations of the Volume Controller embodiment.

397. Other potential modifications are disclosed in Figure 23 and the text from [0160] to [0177], and Tsuji expressly discloses that those modifications are applicable to the other embodiments. Ex. at [0166].

398. Thus, a person of ordinary skill in the art would understand these express disclosures as confirmation that the teachings of Tsuji can be taken as a whole as an anticipatory reference in the analysis below that is directed to the individual claim elements. Alternatively, it is my opinion that combining elements from all of the disclosed embodiments and modifications in Tsuji would have been obvious to a person of ordinary skill, as Tsuji expressly teaches that these embodiments and modifications are design choices applicable to all disclosed embodiments, with disclosed advantages and disadvantages.

Limitation 12.a: a touch-sensitive input device configured to output a sensor signal indicating an object contacting the touch-sensitive input device

399. Tsuji discloses and/or renders obvious a touch-sensitive input device configured to output a sensor signal indicating an object contacting the touch-sensitive input device. I note that this element merely requires outputting a sensor signal indicating contact with the touch-sensitive input device, and places no further limitations on the nature of that signal.

400. As I discussed above with respect to the preamble, which I

incorporate here by reference, the information display and input device disclosed in Tsuji, in its various forms, combines a display and a touch panel, and therefore comprises a touchscreen.

401. One of the embodiments of the system disclosed in Tsuji is an Automatic Teller Machine (ATM) having such a touchscreen, as is shown in Figure 1. This figure, along with the accompanying description of the input and display functionality of its touchscreen, is reproduced below:

Ex. at Figure 1.

FIG. 1 is a perspective view of an automatic teller machine (ATM) 1 as an example of a system incorporating an information display device 100 of a first embodiment of the present invention. The automatic teller machine 1 is provided with a cashier section 3 and a card and bank passbook insertion section 4 on a front surface of a chassis 2. The machine is also provided with an information input and output section 5, and the information display device 100 is used in the information input and output section 5.

Ex. at [0041] (emphasis added).

402. The information display device 100 of Figure 1 is shown in greater detail in Figure 2 of Tsuji, which is reproduced below:

Ex. at Figure 2.

403. Tsuji identifies Figure 2 as a view of the exterior of the information display device 100 at [0042], and describes it as follows:

In FIG. 2, the information display device 100 is provided with a substantially box-like housing 101, and the portion housed in this housing 101 is divided mainly into a display operating unit DP facing an operator side, and a control circuit unit CT on the backside thereof. A substantially rectangular operating surface 11 is exposed on a main surface MS of the housing 101. The operating surface 11 is either transparent or semi-transparent, and thus contents displayed on an information display surface 21 (see FIG. 3) can be viewed through the operating surface 11.

Ex. at [ ].

404. Tsuji discloses that the ATM embodiment (and all embodiments) can have two variations of touch panel input sensors, and corresponding control circuitry CT.

405. As referenced above, Figure 3 of Tsuji provides additional detail of the display and other components of the information display device shown in cross-section, while Figure 4 is a see-through plan view of the display device. Ex. at [0045]. Figures 3 and 4 are reproduced below, with annotations added in red font to Figure 3 to identify relevant numeric components:3

[Figure 3 annotations: Window; LCD Display Panel; Info. Display Surface; Operating Panel (10) and Surface (11); Piezoelectric Elements E1-E4]

Ex. at Figure 3, annotation added.

3 The identification of these numbered elements can be found in Tsuji at [0044]-[0046].

Ex. at Figure 4.

406. As shown in these figures, four piezoelectric elements E1-E4 are coupled to each of the four corners of the liquid crystal display panel 20. Tsuji describes this arrangement as follows:

As illustrated in FIG. 4, four piezoelectric elements E1 through E4 are arranged, one adjacent to each of the four corners of the liquid crystal display panel 20. The piezoelectric elements E1 through E4 are unit function means serving as elements of bi-directional function means able to convert mechanical actions into electrical signals in two directions. These piezoelectric elements E1 through E4 are affixed to a bottom surface of the case 40 in FIG. 3, and the eight apex portions thereof provide support areas near the four corners of a transparent or semi-transparent operating panel 10. The operating panel 10 is, for example, a glass plate or acrylic plate, or the like, having a substantially rectangular planar shape.

Ex. at [0046] (emphasis added).
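To illustrate the sensing and feedback flow that follows from this four-corner arrangement, the sketch below models (i) a total operating force computed from the four corner forces, (ii) a contact position estimated from those forces, and (iii) classification of the total force against thresholds into ranges F0-F4 with drive-mode selection from a Fig. 13-style region/force matrix, as discussed earlier in this section in connection with Figures 7, 13, and 14. The weighted-centroid position formula, threshold values, and matrix entries are my own illustrative assumptions, not Tsuji's disclosed equations.

```python
# Illustrative sketch (my own model, not Tsuji's equations): forces f1..f4
# sensed at the four corner piezoelectric elements yield a total operating
# force and an estimated contact position; the total force is then
# classified against four thresholds into ranges F0-F4, and a drive mode is
# selected from a Fig. 13-style region/force matrix. The coordinates,
# thresholds, and matrix entries below are hypothetical.
CORNERS = [(0.0, 0.0), (100.0, 0.0), (0.0, 60.0), (100.0, 60.0)]  # E1..E4
THRESHOLDS = [1.0, 2.0, 3.0, 4.0]                                  # Fh1..Fh4

def total_force(forces):
    """Total operating force F taken as the sum of the corner forces."""
    return sum(forces)

def contact_position(forces):
    """Force-weighted centroid of the corner coordinates (an assumed
    position computation, for illustration only)."""
    F = total_force(forces)
    if F == 0:
        return None                      # no contact sensed
    x = sum(f * cx for f, (cx, _) in zip(forces, CORNERS)) / F
    y = sum(f * cy for f, (_, cy) in zip(forces, CORNERS)) / F
    return (x, y)

def classify_force(F):
    """Return 0..4 for ranges F0..F4 based on the four thresholds."""
    return sum(1 for th in THRESHOLDS if F >= th)

# Hypothetical Fig. 13-style matrix: region x force range -> drive mode.
DRIVE_MODES = {"R1": ["S11", "S12", "S13", "S14"],
               "R2": ["S21", "S22", "S23", "S24"]}

def select_drive_mode(region, F):
    r = classify_force(F)
    if r == 0 or region == "R0":         # force in F0, or no defined region
        return None                      # no vibration mode selected
    return DRIVE_MODES[region][r - 1]

# Equal corner forces imply a contact at the panel center:
print(contact_position([1.0, 1.0, 1.0, 1.0]))   # (50.0, 30.0)
print(select_drive_mode("R1", total_force([1.0, 1.0, 1.0, 1.0])))  # S14
```

The point of the sketch is only the structure of the pipeline: common corner-mounted elements supply the force signals, a classification stage compares the total force against the thresholds, and the region/force pair indexes a table of drive modes such as S11 or S12.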

407. In the first variation, the piezoelectric elements E1 through E4 as described above function both as sensors for sensing user contact with the touch panel and as actuators capable of generating haptic feedback in response to this contact:

Moreover, the piezoelectric elements E1 through E4 in FIG. 3 are used in the device of the first embodiment as elements combining both sensing means for sensing whether a bank user has pressed any of the operating regions R1 through R7, and driving means for gently vibrating the operating panel 10 based on said pressing.

Ex. at [0048].

408. A person of ordinary skill in the art would thus understand that the piezoelectric elements E1 through E4 represent sensors configured to output a sensor signal indicating an object contacting the touch-sensitive input device, and therefore disclose this claim element.

409. Tsuji describes this sensing operation in greater detail in conjunction with Figure 7, which is reproduced below:

Ex. at Figure 7.

410. The left-hand side of Figure 7 depicts touch panel 11 and piezoelectric elements E1 through E4 coupled to operation unit 51 and drive unit 75. When an object contacts touch panel 11, the piezoelectric elements E1 through E4 output a sensor signal in the form of element voltages, indicated as ek in the figure:

In FIG. 7, the element voltages ek (k equals 1 to 4) of each of the piezoelectric elements E1 through E4 coupled with the operating panel 10 are applied in parallel to an operation unit 51.

Ex. at [0082].

411. Figure 8 illustrates the internal configuration of the operation unit 51 referenced with respect to Figure 7, and is shown below:

Ex. at Figure 8.

412. Tsuji discloses that the element voltages ek, shown as an input signal on the left side of Figure 8, are output by the piezoelectric elements E1-E4 when a force is applied to these elements, as would occur when an object contacts the touch panel. These voltage signals are provided to position calculating unit 51b and operating force calculating unit 51d via signal converter 51a, as shown in Figure 8 above. Tsuji describes this aspect of operation as follows:

FIG. 8 illustrates the internal configuration of the operation unit 51. A numerical relationship between the force applied to the piezoelectric elements E1 through E4 and the terminal voltage is preset in a signal converter 51a inside the operation unit 51. The terminal voltages ek of

the piezoelectric elements E1 through E4 are each converted by the signal converter 51a into signals Sfk for expressing forces fk (k = 1 to 4) applied to the piezoelectric elements E1 through E4, and these signals Sfk are applied in parallel to a position computing unit 51b and an operating force detecting unit 51[d].4

Ex. 1111 at [0083].

[Footnote 4] The translated document contains a typographical error, identifying the operating force detecting unit as 51c rather than 51d. A person of ordinary skill in the art would readily recognize from a cursory review of Figure 8 that the numeric label corresponding to the operating force detecting unit is 51d.

413. A person of ordinary skill in the art would thus understand that the voltage signal ek that is output by piezoelectric elements E1 through E4 in response to contact on the touch panel discloses "a sensor signal indicating an object contacting the touch-sensitive input device." Thus, the touch-sensitive device disclosed in Tsuji discloses this claim element, for at least the reasons set forth above.

414. Alternatively, the additional signals comprising the operating position signal SP and/or the operating force signal SF output by position calculating unit 51b and operating force calculating unit 51d (which are derived from the element terminal voltages ek) also disclose the "a sensor signal indicating an object contacting the touch-sensitive input device" limitation. Tsuji describes, for example, how the operating position signal SP is at a non-active level when the operator of the device is not contacting any part of the operating surface 11:

Accordingly, a signal SR that classifies operating regions R1 through R6 in order to express whether to denote an operating region R1 through R7 or the non-operating region R0, is output from the

comparing and determining unit 52a in FIG. 10. Note that when the operator is not touching any part of the operating surface 11, the operating position signal SP is considered to be at a non-active level and, based on this, a region determining signal SR is also considered to be at a non-active level. To distinguish the plurality of regions R1 through R0 and non-active levels, region determining signal SR is assumed to be a multilevel signal carrying a plurality of bits.

Ex. 1111 at [0091], (emphasis added).

415. Thus, the touch-sensitive device disclosed in Tsuji discloses this claim element, for at least this additional reason.

416. Tsuji discloses a second variation of touch sensors and control circuitry CT that is depicted in Figures 15-16, and described at paragraphs [0133]-[0144]. In this second variation, the touch panel is a resistive type touch panel that uses transparent electrodes on the display surface to perform the function of sensing contact and providing position information, rather than processing the piezoelectric element voltages for this purpose.

417. Figure 15 of Tsuji provides a detailed view of the components and configuration of this second variation, including the resistive touch panel 10T, display, and other components of the information display device shown in cross-section. Tsuji at [0133]-[0134]. Figure 15 is reproduced below, with annotation added in red font to identify relevant numeric components:5

[Footnote 5] The identification of these numbered elements can be found in Tsuji at [0134] and [0137], the latter paragraph further indicating that the remaining configuration of the display operating unit DP in FIG. 15 is the same as that in FIG. 3...

[Figure 15 annotations: Window; Touch Panel 10T and Surface 11; LCD Display; Info. Display Panel 20; Surface 21; Piezoelectric Elements E1-E4]

Ex. 1111 at Figure 15.

418. As described above, Tsuji discloses that the touch panel 10T can be of a resistive design, which was a type of touch panel that was well known to those of skill in the art at the time of the alleged invention, as described previously in this report. Resistive touch panels provide position information directly, and thus signals output by this touch panel when contacted by an object represent an additional disclosure of claim element 12a. Tsuji describes the operation of this resistive touch panel as follows:

In FIG. 15, the display operating unit DP allows an operator to specify an operating position using a touch panel 10T. The touch panel 10T is, for example, a resistive type panel having transparent electrodes arranged in an orthogonal matrix of M rows and N columns in an XY plane on a transparent substrate. With this panel, the intersections of these rows and columns form switches, each cell of the matrix is considered to be a unit, and XY direction operating position signals are output.

Ex. 1111 at [0134].
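The matrix-scan behavior quoted above can be sketched in a few lines. This is an illustrative model only, not Tsuji's circuit: the grid dimensions, the boolean switch-state representation, and the function name are assumptions introduced here.

```python
# Illustrative sketch (assumptions, not Tsuji's disclosure): scanning an M x N
# matrix touch panel like the resistive panel 10T described in the quoted
# passage. Each row/column intersection acts as a switch; the scan reports the
# XY cell of a closed switch as the operating position signal.

M, N = 4, 6  # assumed rows x columns

def scan_matrix(closed):
    """Return the (x, y) operating position of the first closed switch,
    or None when no intersection is pressed.
    `closed` is an M x N grid of booleans modeling the switch states."""
    for y in range(M):
        for x in range(N):
            if closed[y][x]:
                return (x, y)  # XY direction operating position signal
    return None

# A press closing the switch at row 2, column 3 yields position (3, 2).
grid = [[False] * N for _ in range(M)]
grid[2][3] = True
print(scan_matrix(grid))  # → (3, 2)
```

A real panel would scan rows and columns electrically rather than inspect a boolean grid, but the output, a per-cell XY position, is the point the quoted passage makes.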

419. Tsuji further discloses that, alternatively, the touch panel 10T may instead be an optical, acoustic, or capacitive touch panel, each of which types of touch panels were well known at this time:

The touch panel 10T is not limited to a resistive type panel and thus may also be a, (1) photoelectric touch panel for detecting an operating position where data light from a light-emitting element incident on a light receiving element is blocked or attenuated by a finger, or the like, (2) an ultrasonic touch panel for detecting an operating position where ultrasonic waves are emitted and ultrasonic wave oscillating element that enter a geophone element are blocked or attenuated by a finger, or the like, or (3) a capacitive touch panel for detecting a position touched by a finger, or the like, based on a change in capacitance, or the like.

Ex. 1111 at [0135], (emphasis added).

420. Like resistive touch panels, the optical, ultrasonic, and capacitive touch panels disclosed in Tsuji also provide position information directly, and thus signals output when contacted by an object using these alternative sensing technologies would represent an additional disclosure of claim element 12a.

421. This aspect of their function is further described in Tsuji with respect to Figure 16, which is a block diagram of the corresponding control circuit (CT) used in conjunction with the touch screen display panel of Figure 15. Figure 16 is shown below:

Ex. 1111 at Figure 16.

422. Tsuji notes that most of the elements of the control circuit CT in FIG. 16 have the same configuration and function as in FIG. 7... Ex. 1111 at [0138].

423. There are some differences in the control circuit CT of Figure 16 as compared to that of Figure 7, however, which arise from the use of the resistive, capacitive, optical or ultrasonic touch panel for position sensing, rather than processing of the ek signals. For example, Tsuji describes the summing of the four ek signals in operation unit 51F to output a total force signal, rather than processing them individually to obtain position information, as was done in the first embodiment:

On the other hand, while the terminal voltages ek (K = 1 to 4) of the piezoelectric elements E1 through E4 are each applied in parallel to an operation unit 51F, the operation unit 51F is equivalent to a unit where the position calculating unit 51b has been omitted from the configuration in FIG. 8. That is, because the touch panel 10T specifies the operating position in the second embodiment, all that has to be calculated from the output voltage of the piezoelectric elements E1 through E4 is the total operating force F.

Ex. 1111 at [0141].

424. Thus, a person of ordinary skill in the art would understand that the voltage signal ek that is output by piezoelectric elements E1 through E4 in response to contact on the touch panel discloses "a sensor signal indicating an object contacting the touch-sensitive input device," just as with the first embodiment.

425. Alternatively, the additional signal SP output from the operating position specification 51T and/or the force signal SF output by operating unit 51F also disclose "a sensor signal indicating an object contacting the touch-sensitive input device." Thus, the second embodiment of the touch-sensitive device disclosed in Tsuji discloses this claim element, for at least this additional reason.

426. Tsuji describes the advantages offered by the use of resistive, capacitive, optical and ultrasonic touch panels as improved position-sensing accuracy, and easier specification of operating regions, at paragraphs [0143]-[0144].

427. The PDA embodiment of the system disclosed in Tsuji comprises a

hand-held information display device having a touchscreen, as shown in Figures 17 and 18. These figures, along with accompanying description of the touchscreen, are reproduced below:

Ex. 1111 at Figure 17. Ex. 1111 at Figure 18.

428. "FIG. 17 is a perspective drawing of the exterior of an information display device 200 according to a third embodiment of the present invention." Tsuji at [0145].

The operating regions R1 through R4 displayed by the liquid crystal display panel are visible through the operating surface 11 in FIG. 17. Typically, these operating regions R1 through R4 are displayed along both sides of the operating surface. An operator grips the housing 201 by both sides, as illustrated by the broken lines in FIG. 17, and performs operations by pressing these operating regions R1 through R4 with his/her thumbs. When the position of this pressing operation is sensed, if the pressing force thereof is larger than a predetermined threshold, the operation input is accepted, a displayed object 210 (FIG. 18) on a screen changes, and the operating surface 11 is vibrated

or slightly displaced based on a predetermined mode. The operation at this point is the same as in the first and second embodiments.

Ex. 1111 at [0145], (emphasis added).

429. Tsuji discloses that the touchscreen touch panel, display, and control circuit operate and are configured like those described with respect to the prior two embodiments:

FIG. 17 is an external perspective view of an information display device 200 according to a third embodiment of the present invention. The information display device 200 is a liquid crystal display type game device given as one example of a mobile information display device. With the information display device 200, the operating surface 11 is exposed on a main surface MS of a box shaped housing 201. This operating surface is equivalent to the operating panel 10 in FIG. 3, or the touch panel 10T in FIG. 15. The display operating unit and control circuit behind the operating surface 11 are configured in the same way as the display operating unit DP in the first embodiment and the second embodiment.

Ex. 1111 at [0145], (emphasis added).

430. Thus, the PDA embodiment discloses "a touch-sensitive input device configured to output a sensor signal indicating an object contacting the touch-sensitive input device" for the same reasons as discussed above with respect to the first and second embodiments of Tsuji.

431. In the Volume Controller embodiment disclosed by Tsuji, Tsuji discloses that the processor is configured to output a display signal displaying graphical objects (sliders 301, 305, buttons 304H, 304L) on the touchscreen. Ex. 1111, Figs (graphical objects); 22 (showing display signal from processor

60 to display driver 71 and LCD 20); [ ].

Ex. 1111 at Figure 20. Ex. 1111 at Figure 21.

432. Tsuji discloses the control circuit CT is modified slightly, as described in Figure 22 and accompanying text, which I will discuss in more detail below. Other than that modification, Tsuji discloses that the touchscreen touch panel, display, and control circuit operate and are configured like those described with respect to the prior two embodiments. Ex. 1111 at [0159].

Limitation 12.b: an actuator coupled to the touch-sensitive input device, the actuator configured to receive an actuator signal and output a haptic effect to the touch-sensitive surface based at least in part on the actuator signal;

433. As I will describe below, Tsuji discloses "an actuator coupled to the touch-sensitive input device, the actuator configured to receive an actuator signal and output a haptic effect to the touch-sensitive surface based at least in part on the actuator signal."

i. Tsuji discloses an actuator coupled to the touch-sensitive input device:

434. As I discussed above in my analysis of claim element 12a, Tsuji discloses piezoelectric elements E1-E4 that are coupled to the touchscreen in each of the embodiments disclosed. These can be observed, for example, in Figures 3 and 4 depicting the first embodiment, shown with annotation below:

[Figure 3 annotations: Window; LCD Display Panel; Info. Display Surface; Operating Panel (10) and Surface (11); Piezoelectric Elements E1-E4]

Ex. 1111 at Figure 3, annotations added.

[Figure 4 annotations: Piezoelectric actuators coupled to the touchscreen]

Tsuji at Figure 4, annotations added.

435. In this first embodiment, the piezoelectric actuators are electrically coupled to the touch-sensitive input device via drive unit 75, through which an actuator signal is delivered, as shown in Figure 7:

Ex. 1111 at Figure 7, with annotation added (Drive Unit 75 in red).

436. Tsuji also discloses piezoelectric actuator elements (E1 through E4) that are coupled to the touchscreen in the second embodiment, in which resistive, capacitive, optical, or ultrasonic touch panels are utilized. These can be observed, for example, in Figure 15 depicting the second embodiment, shown with annotation below:

[Figure 15 annotations: Window; Touch Panel 10T and Surface 11; LCD Display; Info. Display Panel 20; Surface 21; Piezoelectric Elements E1-E4]

Ex. 1111 at Figure 15, with annotations added.

437. In this second embodiment, the piezoelectric actuators are electrically coupled to the touch-sensitive input device via drive unit 75, through which an actuator signal is delivered, as shown in Figure 16:

Ex. 1111 at Figure 16, with annotation added (Drive Unit 75 in red).

ii. Tsuji discloses the actuator configured to receive an actuator signal and output a haptic effect to the touch-sensitive surface based at least in part on the actuator signal

438. As I showed above with respect to Figures 7 and 16, the piezoelectric actuators are configured to receive an actuator signal delivered via drive unit 75. Tsuji describes how this actuator signal causes the piezoelectric actuator elements E1-E4 to generate a haptic effect, such as a vibration:

High frequency is applied to the piezoelectric elements E1 through E4 when an operating force larger than a predetermined threshold is sensed, which thus causes the operating surface 11 to vibrate. This

vibration allows an operator to obtain a reliable sense of operation. The number of parts is small because the sensing of the operating force applied to the operating surface and the application of the vibration to the operating surface 11 are performed using the common piezoelectric elements E1 through E4.

Ex. 1111 at Abstract.

439. Tsuji also discloses that in addition to a vibrational haptic effect, other haptic effects may be provided, such as pulsed displacement:

In FIG. 7, the drive mode parameter signal V output from the drive mode selecting unit 72 is applied to the piezoelectric element drive unit 75. The piezoelectric element drive unit 75 has a high frequency oscillation circuit 76, which transmits a high frequency wave specified by the parameter signal V to the piezoelectric elements E1 through E4. Thus, the piezoelectric elements E1 through E4 are vibrated or slightly displaced at a specified amplitude and timing.

Ex. 1111 at [0122], (emphasis added).

This is felt by the user because it causes the entire touch panel to vibrate. Ex. 1111 at [0123], (emphasis added).

When one drive mode is selected using the region determining signal SR and the operating force determining signal FB in this way, the parameter values that specify drive modes in FIG. 14 are read from the drive mode storage unit 73, and then applied to a piezoelectric element drive unit 75 in FIG. 7. In response to this, a vibration voltage is applied to the piezoelectric elements E1 through E4 such that the piezoelectric elements E1 through E4 vibrate or are displaced slightly, and this vibration or slight displacement is propagated to the operating surface 11. When the operator presses down on one of the operating regions R1 through R7 with at least a predetermined force, a tactile action intended to notify the operator that the operation was received is generated by a vibration or slight sliding of the operating surface 11.

Ex. 1111 at [0113], (emphasis added).
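The threshold-gated behavior these passages describe can be sketched as follows. This is a hedged illustration, not Tsuji's implementation: the threshold value, the mode labels, and the signal encodings are assumptions introduced for this sketch.

```python
# Illustrative sketch (assumed names/values): a haptic drive is issued only
# when the operating force for an in-operation region exceeds a predetermined
# threshold, loosely modeling drive mode selecting unit 72's inputs SR and FB.

FORCE_THRESHOLD = 2.0  # assumed "predetermined threshold" (arbitrary units)

def select_drive(region_sr, force_f):
    """Pick a drive-mode label from the region-determining signal SR and the
    operating force F, or None when no haptic effect should be output.
    region_sr <= 0 models the non-active level / non-operating region R0."""
    if region_sr <= 0:
        return None
    if force_f < FORCE_THRESHOLD:  # press too light: input not accepted
        return None
    # Assumed mapping: stronger presses select the larger-amplitude mode.
    return "large_amplitude" if force_f >= 2 * FORCE_THRESHOLD else "small_amplitude"
```

In Tsuji the selected mode's parameter values would then drive the piezoelectric elements through drive unit 75; here the returned label merely stands in for that parameter signal V.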

440. Tsuji discloses that a variety of different drive modes may be defined, stored, and utilized to generate an actuator signal that is applied to the piezoelectric actuator elements E1-E4. Figure 14 of Tsuji depicts examples of actuator signal waveforms corresponding to different drive modes that may be stored in Drive mode storage 73 and utilized to drive the actuators, and is reproduced below:

Ex. 1111 at Figure 14.

441. Examples of actuator signals disclosed by Tsuji and depicted in Figure 14 include continuous vibrations of selectable amplitudes and frequencies, as well as short bursts of vibrations and discrete pulses. This is described by Tsuji as

follows:

FIG. 14 schematically illustrates a variety of drive modes stored in a drive mode storage unit 73. For example, FIG. 14(a) illustrates a mode for performing a continuous vibration having a small amplitude, while FIG. 14(b) is a vibration mode having a large amplitude. FIG. 14(c) illustrates a vibration mode having a different frequency than FIG. 14(a) and (b), while FIG. 14(d) and (e) illustrate examples where vibration is performed for a short period of time once or twice. Furthermore, FIG. 14(f) is a vibration mode for applying only a single vibration (one-shot pulse). Note that examples of other modes are described below.

Ex. 1111 at [0110].

442. These and other drive modes are used to provide an actuator signal to the piezoelectric actuator elements which cause a corresponding haptic effect to be output to the touch-sensitive surface, as further described by Tsuji below:

When one drive mode is selected using the region determining signal SR and the operating force determining signal FB in this way, the parameter values that specify drive modes in FIG. 14 are read from the drive mode storage unit 73, and then applied to a piezoelectric element drive unit 75 in FIG. 7. In response to this, a vibration voltage is applied to the piezoelectric elements E1 through E4 such that the piezoelectric elements E1 through E4 vibrate or are displaced slightly, and this vibration or slight displacement is propagated to the operating surface 11.

Ex. 1111 at [0113], (emphasis added).

On the other hand, the drive mode selecting unit 72 in FIG. 7, which inputs the region determining signal SR and an operating force determining signal FB, selects a drive mode based on the classifications of the in-operation region and the operating force. The drive mode defines how the operating surface 11 will be vibrated.

Ex. 1111 at [0108], (emphasis added).
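The families of drive waveforms Figure 14 describes can be illustrated with a simple parameterized oscillator. This sketch is mine, not Tsuji's: the amplitudes, frequencies, and sampling times are invented, and a sine wave merely stands in for whatever waveform the high frequency oscillation circuit 76 actually produces.

```python
# Illustrative sketch (assumed parameterization): sampling the kinds of drive
# waveforms Figure 14 describes -- continuous vibration at a selectable
# amplitude and frequency, a short burst limited to a few cycles, and a
# one-shot pulse as the single-cycle limiting case.
import math

def drive_sample(t, amplitude, freq_hz, n_cycles=None):
    """Voltage sample at time t (seconds) for one drive mode.
    n_cycles=None models continuous vibration (cf. FIG. 14(a)-(c));
    otherwise the output stops after n_cycles periods (cf. FIG. 14(d)-(e)),
    with n_cycles=1 approximating the one-shot pulse of FIG. 14(f)."""
    if n_cycles is not None and t >= n_cycles / freq_hz:
        return 0.0
    return amplitude * math.sin(2 * math.pi * freq_hz * t)

# Continuous small-amplitude mode vs. a one-cycle burst, sampled at 1 kHz.
continuous = [drive_sample(k / 1000.0, 0.2, 100.0) for k in range(50)]
one_shot = [drive_sample(k / 1000.0, 1.0, 100.0, n_cycles=1) for k in range(50)]
```

Varying only the (amplitude, frequency, cycle-count) triple reproduces the distinctions Tsuji draws among the stored modes, which is the sense in which Figure 14's modes are "parameter values" rather than distinct circuits.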

Limitation 12.c: a processor in communication with the sensor and the actuator, the processor configured to:

443. Tsuji discloses and/or renders obvious "a processor in communication with the sensor and the actuator."

444. For example, Figure 7 shows a control circuit CT of the first embodiment, comprising one or more processors in electrical communication with piezoelectric elements E1-E4:

Ex. 1111 at Figure 7.

445. As I described with respect to claim element 12a above, which discussion I incorporate here by reference, piezoelectric elements E1-E4 in the first embodiment function as a sensor to output a sensor signal indicating an object contacting the touch panel. As I described with respect to claim element 12b

above, which discussion I incorporate here by reference, these same piezoelectric elements also function as the actuators to output a haptic effect to the touch panel surface.

446. A person of ordinary skill in the art would understand that control circuit CT depicted in Figure 7 above performs the functions of a processor, and could be implemented in hardware, or with software executed by a microcontroller. This is expressly confirmed by Tsuji, in which he states that control circuit CT can be realized with software using a microcomputer:

Next, the configuration and operation of the control circuit unit CT (FIG. 7) of the information display device 100 will be described based on the principles described above. Note that while an example where the control circuit unit CT is configured using a hardware circuit is illustrated here, this circuit can also be realized with software using a microcomputer. In this case, the following circuit portions are functionally realized using the MPU and memory of said microcomputer.

Ex. 1111 at [0081], (emphasis added).

447. The "following circuit portions" that Tsuji references above, which are implemented by an MPU running software instructions in the memory of the microcomputer, would comprise items 51-57, 60-62 in Figure 7, and would be in communication via signal lines with the piezoelectric sensor/actuators. The MPU would be configured to display a graphical object on the touch-sensitive input device by using display driver 71 to drive LCD 20, which will display a graphical object on the display that is visible through the operating panel/surface

10/11.

448. With respect to the second embodiment in which the sensor comprises a resistive, capacitive, optical or acoustic touch panel, Figure 16 shows a control circuit CT of the second embodiment, comprising one or more processors in electrical communication with piezoelectric elements E1-E4 and touch-sensitive panel 10T having surface 11:

Ex. 1111 at Figure 16.

449. As I described with respect to claim element 12a above, which discussion I incorporate here by reference, the sensor comprises a resistive,

capacitive, optical or acoustic touch panel 10T having a surface 11, as shown.6 As I described with respect to claim element 12b above, which discussion I incorporate herein by reference, the piezoelectric elements E1-E4 function as the actuator to output a haptic effect to the touch panel surface.

[Footnote 6] As I described in my discussion of the second embodiment with respect to the sensor signal limitation of claim element 12a, either and/or both of the signals generated by the piezoelectric elements E1-E4 and the resistive, capacitive, optical or acoustic touch panel disclose the sensor signal limitation.

450. A person of ordinary skill in the art would understand that control circuit CT depicted in Figure 16 above performs the functions of a processor, and could be implemented in hardware, or with software executed by a microcontroller. This is expressly confirmed by Tsuji, in which he states that the functions of control circuit CT shown in Figure 16 can, just like that of Figure 7 of the first embodiment, be implemented using software:

FIG. 16 is a configuration diagram of a control circuit CT using the display operating unit DP in Figure 15, and while the circuit is described as a hardware circuit, as was the circuit in FIG. 7, the functions of these circuits can be expressed using software.

Ex. 1111 at [0138], (emphasis added).

451. Thus, for the same reasons set forth above with respect to Figure 7, the functional circuits in Figure 16 that can be implemented by an MPU running software instructions in the memory of the microcomputer would comprise items 51F/T, 52-57, 60-62 in Figure 16, and would be in communication via signal lines with the piezoelectric sensor/actuators. The MPU would be configured

to display a graphical object on the touch-sensitive input device by using display driver 71 to drive LCD 20, which will display a graphical object on the display that is visible through the transparent touch panel 10T at surface 11.

452. Descriptions of some examples of these signal paths between the processor of control circuit CT and the sensor/actuator of Tsuji are as follows:

A region determining signal SR allowed to pass through the gate circuit 56 is input to a first processing unit 61 inside the information processing unit 60. The first processing unit 61 performs information processing and generates control signals for each device unit based on the menu item selected by the operator, and conveys the facts of this action to an external device (for example, a host computer) as necessary.

Ex. 1111 at [0106].

In the case of information display where screens change in conjunction with the operational input of an operator, vibration is stopped after, for example, the operating panel 10 has been vibrated for exactly a predetermined amount of time. This can be accomplished by using a signal transmitting path from the information processing unit 60 to the drive mode selecting unit 74 to forcefully put the drive mode parameter signal V into an inactive level.

Ex. 1111 at [0125].

453. As further evidence supporting my opinion that Tsuji discloses this limitation, see also: Ex. 1111 at [0087], [ ], [ ], [0121], [0155]; Figure 7.

Limitation 12.d: output a display signal configured to display a graphical object on the touch-sensitive input device;

454. Tsuji discloses that the processor outputs "a display signal configured to display a graphical object on the touch-sensitive input device."

455. As described above with respect to claim element 12c, which discussion I incorporate herein by reference, a person of ordinary skill in the art would understand that control circuit CT depicted in Figures 7 and 16 performs the functions of a processor, and could be implemented in hardware, or with software executed by a microcontroller. This is expressly confirmed by Tsuji, in which he states that the functions of control circuit CT shown in these figures can be implemented by an MPU running software instructions in the memory of a microcomputer. Included among these functions are information processing unit 60 and display driver 71, which outputs a display signal configured to display a graphical object on the touch-sensitive input device using liquid crystal display panel 20. This is shown within the boxed areas of Figures 7, 16, and 22, reproduced below with annotation:

Ex. 1111 at Figure 7, annotation added.

Ex. 1111 at Figure 16, annotation added. Ex. 1111 at Figure 22, annotation added.

456. As shown above, Tsuji discloses outputting a display signal to a liquid crystal display panel in each of his disclosed embodiments. This display signal is configured to display a graphical object on the touch-sensitive input device. Some examples of the display screens showing displayed graphical objects as disclosed in Tsuji are discussed below.

457. As a first example (the ATM embodiment), Figure 4 depicts a menu screen displayed on display panel 20 as might be used for a banking application. A number of graphical objects comprising menu buttons can be observed:

Ex. 1111 at Figure 4.

458. Tsuji describes the display of graphical objects with respect to that shown in Figure 4 as follows:

While a variety of information can be displayed on the liquid crystal display panel 20, the example in FIG. 4 illustrates menus of a bank automatic cashier. Regions R1 through R7 displayed by these menus serve as operating regions for bank users. For example, when a bank user presses on region R1 displaying "Deposit" with his or her finger using at least a predetermined amount of force, the information display device 100, through an operation described below, senses that "Deposit" has been selected, notifies the bank host computer of this, and then assumes a state in which cash can be accepted. Furthermore, at the same time, the information display surface 21 changes to a screen displaying guidance and a new operating menu for accepting the cash. Note that the size and placement of these operating regions R1 through R7 can be set randomly. Furthermore, region R0 in FIG. 4 illustrates a region in the information display surface 21 that is outside the operating regions R1 through R7.

Ex. 1111 at [0047].
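The press-to-select behavior quoted above can be sketched as region hit-testing gated by a force threshold. The rectangle layout, threshold value, and names below are assumptions for illustration only; Tsuji does not disclose this code.

```python
# Illustrative sketch (invented geometry and threshold): hit-testing a menu of
# operating regions like R1-R7 in Figure 4 -- mapping a sensed operating
# position to a region, or to the non-operating region R0, and accepting the
# input only when the press uses at least a predetermined amount of force.

MIN_FORCE = 2.0  # assumed "predetermined amount of force" (arbitrary units)

# Assumed rectangles: region number -> (x0, y0, x1, y1) on a unit-square panel.
MENU = {1: (0.0, 0.0, 0.5, 0.25),   # e.g. the "Deposit" button
        2: (0.5, 0.0, 1.0, 0.25)}   # e.g. the "Withdrawal" button

def select_region(pos, force):
    """Return the selected region number, or 0 (R0) for no selection."""
    if force < MIN_FORCE:
        return 0
    x, y = pos
    for r, (x0, y0, x1, y1) in MENU.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return r
    return 0
```

In the declaration's terms, the returned region number plays the role of the region determining signal SR; a nonzero result is what would trigger the host notification and screen change the quoted passage describes.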

459. Tsuji confirms that the output signal from the processor of control circuit CT that is configured to display these graphical objects is provided by display driver 71, as shown in Figures 7 and 16 discussed above:

For example, when "Withdrawal," which corresponds to region R2 in FIG. 3, is selected, the screen is switched to an input operation screen for withdrawing money by driving the liquid crystal display panel 20 through a display driver 71.

Ex. 1111 at [0106].

460. Tsuji also explains that the display signal can be reconfigured to change the color of the button the user is merely touching but not selecting (Ex. 1111 at [0107]) and display a numeric keypad, for example, if the user selects the "Withdrawal" button. Ex. 1111 at [0120].

461. As a second example (the PDA embodiment) showing that the display signal is configured to display a graphical object on the touch-sensitive input device, Figure 17 depicts a display screen having defined operating regions R1 and R2 as might be used with a hand-held device, while Figure 18 depicts the display of a number of graphical objects comprising menu buttons and an image. These figures are reproduced below:

Ex. 1111 at Figures 17 & 18.

462. Tsuji states that the control circuit CT and display operating unit of this device are the same as those described with respect to the first and second embodiments, and thus my analysis of those embodiments applies equally to this device:

FIG. 17 is an external perspective view of an information display device 200 according to a third embodiment of the present invention. The information display device 200 is a liquid crystal display type game device given as one example of a mobile information display device. With the information display device 200, the operating surface 11 is exposed on a main surface MS of a box shaped housing 201. This operating surface is equivalent to the operating panel 10 in FIG. 3, or the touch panel 10T in FIG. 15. The display operating unit and control circuit behind the operating surface 11 are configured in the same way as the display operating unit DP in the first embodiment and the second embodiment.

Ex. 1111 at [0145], (emphasis added).

463. Tsuji further describes with respect to these figures how displayed

graphical objects such as the image 210 in Figure 18 change in response to user interaction with the device:

FIG. 17 is a perspective drawing of the exterior of an information display device 200 according to a third embodiment of the present invention. Ex. 1111 at [0040]. The operating regions R1 through R4 displayed by the liquid crystal display panel are visible through the operating surface 11 in FIG. 17. Typically, these operating regions R1 through R4 are displayed along both sides of the operating surface. An operator grips the housing 201 by both sides, as illustrated by the broken lines in FIG. 17, and performs operations by pressing these operating regions R1 through R4 with his/her thumbs. When the position of this pressing operation is sensed, if the pressing force thereof is larger than a predetermined threshold, the operation input is accepted, a displayed object 210 (FIG. 18) on a screen changes, and the operating surface 11 is vibrated or slightly displaced based on a predetermined mode. The operation at this point is the same as in the first and second embodiments.

Ex. 1111 at [0146], (emphasis added).

464. As a third example (the Volume Controller embodiment) disclosed by Tsuji, Tsuji discloses that the processor is configured to output a display signal displaying graphical objects (sliders 301, 305, buttons 304H, 304L) on the touchscreen. Ex. 1111, Figs (graphical objects); [ ] (describing how sliders move and haptic effects are generated in response to interactions).

Ex. 1111 at Figure 20. Ex. 1111 at Figure 21.

As further evidence supporting my opinion that Tsuji discloses this limitation, see also: Ex. 1111 at [ ], [0103], [ ], [0120], [0124], [0154]; Figures 7, 13, 16-18, 22.

Limitation 12.e: receive the sensor signal from the touch-sensitive input device;

466. Tsuji discloses the processor receives the sensor signal from the touch-sensitive input device.

As described above with respect to claim elements 12c and 12d, which discussion I incorporate herein by reference, a person of ordinary skill in the art would understand that the control circuit CT depicted in Figures 7 and 16 performs the functions of a processor, and could be implemented in hardware, or with software executed by a microcontroller. This is expressly confirmed by Tsuji, in which he states that the functions of the control circuit CT shown in these figures can be implemented by an MPU running software instructions in the memory of a microcomputer. Included among these functions are operation unit 51 of Figure 7, and operation unit 51F and operating position specification 51T of Figure 16, each of which comprises a function of the processor and receives the sensor signal from the touch-sensitive input device.

This is shown in Figures 7 and 16:

Ex. 1111 at Figure 7, annotation added.

Ex. 1111 at Figure 16, annotation added.

As shown above, Tsuji discloses operation unit 51 of Figure 7, and operation unit 51F and operating position specification 51T of Figure 16, each of which receives a sensor signal from one or more sensors.

As further evidence supporting my opinion that Tsuji discloses this limitation, see also: Ex. 1111 at [ ], [ ], [ ]; Figures 6, 16, 18, 24; claims 1 ("operating signal extracting means for extracting the electrical signal generated by the bi-directional function means through an operating force applied to the operating surface as an operating signal"), 2-4 ("position signal"), 7, 8, 10. See also the evidence and information cited for claim limitations 12.c and 12.d, which is incorporated here by reference.

Limitation 12.f: determine an interaction between the object contacting the touch-sensitive surface and the graphical object;

472. Tsuji discloses the processor determines an interaction between the object contacting the touch-sensitive surface and the graphical object.

473. A person of ordinary skill in the art would understand the claimed "interaction between the object contacting the touch-sensitive surface and the graphical object" to broadly encompass determination of a wide variety of possible modes of interaction. These would include at least those based on the relative position of the contacting object with respect to that of the graphical object on the display surface, and/or the level of force or pressure associated with the area of contact, among others.

474. As I will describe below, the control circuit CT comprising a processor performs processing which determines both the relative position of the contacting object with respect to that of the graphical object, as well as the level of force associated with the area of contact. It further classifies the level of force into categories based on force thresholds, such that this additional information pertaining to force levels may be used for determining which haptic effects are to be output to the surface in response to the contact.

i. Tsuji discloses a processor configured to determine an interaction

475. As described above with respect to claim elements 12c and 12d, which discussion I incorporate here by reference, a person of ordinary skill in the art would understand that the control circuit CT depicted in Figures 7 and 16 performs the functions of a processor, and could be implemented in hardware, or with software executed by a microcontroller. This is expressly confirmed by Tsuji, in which he states that the functions of the control circuit CT shown in these figures can be implemented by an MPU running software instructions in the memory of a microcomputer. Included among these functions are the region determining units 52 and operating force determining units 54 of Figures 7 and 16. Each of these determining units comprises a function of the processor, and receives signals generated by contact from the touch-sensitive input device via operating unit 51 (Figures 7 and 16) and operating position specification 51T (Figure 16).

476. The control circuit CT comprising a processor, including region determining units 52 and operating force determining units 54, performs processing which determines both the relative position of the contacting object with respect to that of the graphical object, as well as the level of force associated with the area of contact. It further classifies the level of force into categories based on force thresholds, such that this additional information pertaining to force levels may be used for determining which haptic effects are to be output to the surface in response to the contact. I will address each of these functions in turn, below.

ii. Tsuji discloses determining a positional interaction between the object contacting the touch-sensitive surface and the graphical object

477. With respect to determining a positional interaction, Tsuji discloses that region determining unit 52 determines an interaction between the object contacting the touch panel 11 (or 10T) and a graphical object (signal SR indicating which icon R1 through R7 was contacted).

The first step of this process is determining the location on the surface where an object such as a finger contacts the touch-sensitive surface. Tsuji discloses that the processor is configured to determine this location:

Pressing an operating surface 11 of the operating panel 10 with a finger generates voltage at both ends of the piezoelectric elements E1 through E4, and an operating force and an operating position are sensed by detecting and calculating said voltage.

Ex. 1111 at Abstract (emphasis added).

With respect to the positional determination in the first embodiment, Tsuji describes the principles for sensing the operating position at paragraphs [0049]-[0080]:

Before describing the rest of the configuration of this device, the principles by which the piezoelectric elements E1 through E4 are used to sense which of the operating regions R1 through R7 have been pressed will be described.

Ex. 1111 at [0049].

In summary, the procedures described at paragraphs [0049]-[0080] utilize the individual pressing force signals fk from each of the piezoelectric elements E1-E4 to calculate the x and y coordinates of the point of contact P on the surface:

Moreover, a case is imagined where the operating panel 10M is pressed downward at a position at point P (x and y) with a pressing force F. At this time, the principle for sensing (x and y), which are the XY coordinates of point P, using the function of the piezoelectric elements E1 through En, is as described below.

Ex. 1111 at [0051].

Distance constants a and b (see FIG. 6) saved in advance in a constant storage unit 51c are also applied to the position computing unit 51b, and the position computing unit 51b calculates the position coordinates (x and y) for an operating point using Number 16 and Number 17 described above.

Ex. 1111 at [0084].

As described above for the first embodiment, the calculations presented at paragraphs [0049]-[0080] of Tsuji provide the XY coordinates of the point of contact P at which an object such as a finger contacts the display surface. The "operating point" referenced in the passage above comprises the point of contact in XY coordinates on the operating surface 10, and is represented by a signal SP output by the operation unit 51, shown in Figure 7:

Ex. 1111 at Figure 7, annotation added.

The operating position signal SP representing the x and y coordinates of the point of contact of an object of the first embodiment as shown in Figure 7 above is described by Tsuji as follows:

As a result, an operating position signal SP showing the operating position P (x and y) and an operating force signal SF showing the operating force F are both output from the operation unit 51. The operating position signal SP has two components (x and y).

Ex. 1111 at [0086] (emphasis added).

With respect to the second embodiment, I previously described that Tsuji discloses use of a resistive, capacitive, optical or acoustic touch panel to determine the point of contact at which an object such as a finger contacts the display surface in my discussion of claim element 12a, which I incorporate here by reference.

This embodiment is depicted in Figure 15, which is reproduced below with annotation added in red font to identify relevant numeric components:7

[Figure annotations: Window; Touch Panel 10T and Surface 11; LCD Display / Info. Display Panel 20; Surface 21; Piezoelectric Elements E1-E4]

Ex. 1111 at Figure 15, annotation added.

Tsuji discloses that the touch panel 10T comprising, for example, a resistive touch panel, will provide the XY coordinates of the point of contact between an object and the surface of the touch-sensitive device. This is described, for example, in the following passages:

In FIG. 15, the display operating unit DP allows an operator to specify an operating position using a touch panel 10T. The touch panel 10T is, for example, a resistive type panel having transparent electrodes arranged in an orthogonal matrix of M rows and N columns in an XY plane on a transparent substrate. With this panel, the intersections of these rows and columns form switches, each cell of the matrix is considered to be a unit, and XY direction operating position signals are output.

Ex. 1111 at [0134] (emphasis added).

On the other hand, while the terminal voltages ek (K = 1 to 4) of the piezoelectric elements E1 through E4 are each applied in parallel to an operation unit 51F, the operation unit 51F is equivalent to a unit where the position calculating unit 51b has been omitted from the configuration in FIG. 8. That is, because the touch panel 10T specifies the operating position in the second embodiment, all that has to be calculated from the output voltage of the piezoelectric elements E1 through E4 is the total operating force F.

Ex. 1111 at [0141] (emphasis added).

7 The identification of these numbered elements can be found in Tsuji at [0133] and [0137], the latter paragraph further indicating that "the remaining configuration of the display operating unit DP in FIG. 15 is the same as that in FIG. 3..."

The "operating position" referenced in the passages above comprises the point of contact in XY coordinates on the touch panel 10T, and is represented by an operating position signal SP output by the operating position specification unit 51T, shown in Figure 16:

Ex. 1111 at Figure 16, annotation added.

487. Thus, as shown above, the embodiments of the touch-sensitive input device disclosed in Tsuji each determine the location on the surface where an object such as a finger contacts the touch-sensitive surface.

The next step in determining a positional interaction between the object contacting the touch-sensitive surface and the graphical object being displayed is determining whether the point of contact represented by operating position signal SP is inside of a region corresponding to a displayed graphical object. Tsuji discloses that this function is performed by the control circuit processor function comprising region determining unit 52, which he describes, for example, at paragraphs [0086]-[0092] under the heading "Operating Position (Operating Region) Determination."

In this section Tsuji discloses that the operating position signal SP indicating the point of contact of the object in XY coordinates is applied to the region determining unit 52, which in turn determines whether the coordinates of the point of contact are inside or outside of one or more regions of the touch panel corresponding to a displayed graphical object. This is described, for example, in the following passages:

The operating position signal SP obtained by the operation unit 51 is applied to a region determining unit 52 in accordance with FIG. 7. Information (xi-, xi+, yi-, yi+: where i=1 to 7) expressing the apex coordinates (see FIG. 9) of each of the operating regions R1 through R7 in FIG. 4 is input from a region classification storage unit 53 into the region determining unit 52. The information of these apex coordinates is loaded from an information processing unit 60 (FIG. 7) described below in accordance with the content displayed at the time the information is loaded.

Ex. 1111 at [0087] (emphasis added).

The region determining unit 52 uses a comparing and determining unit 52a (FIG. 10) to compare the coordinates (x and y) of the operating point P to the apex coordinates of the operating regions R1 through R7 obtained as described above, and then determines whether the operating point P is in the region R0 or one of the operating regions R1 through R7.

Ex. 1111 at [0088] (emphasis added).

As described above, the region determining unit 52 "uses a comparing and determining unit 52a (FIG. 10) to compare the coordinates (x and y) of the operating point P to the apex coordinates of the operating regions R1 through R7 obtained as described above, and then determines whether the operating point P is in the region R0 or one of the operating regions R1 through R7." Ex. 1111 at [0088]. Figure 10 is reproduced below, and shows the comparing and determining unit 52a accepting the operating position signal SP and the signal from region classification storage unit 53 as inputs, determining which of regions R0-R7 the point of contact corresponds to. It then outputs the result of this determination process as the region determining signal SR:

Ex. 1111 at Figure 10.

As Tsuji explains, a determination that the operating point P is in non-operating region R0 corresponds to a determination that the point of contact is outside of any of the operating regions R1-R7 corresponding to displayed graphical objects that are operable, or selectable:

It is also determined whether or not the coordinates (x and y) of the operating point P are in the region (non-operating region) R0, which is the region outside the operating regions R1 through R7 on the liquid crystal display screen.

Ex. 1111 at [0090].

It is important to note that operating regions R1-R7 correspond to displayed graphical objects that are selectable by the user by contacting the touchscreen with a finger in the region corresponding to the displayed object. Thus, this function of the processor performed by region determining unit 52 discloses determining an interaction between the object contacting the touch-sensitive surface and the graphical object displayed on the liquid crystal display 20. In this particular example, the nature of the interaction which is determined is that of relative position between the contacting object and that of the displayed graphical object, which is one of the same types of interaction disclosed in the '356 patent.

In the ATM embodiment, the correspondence between one or more of the operating regions R0-R7 and graphical objects displayed on the touchscreen of Tsuji is disclosed, for example, with respect to Figure 4, which is reproduced below:

Ex. 1111 at Figure 4.

This figure depicts a screen that might be displayed in conjunction with a banking application, and includes seven displayed graphical objects comprising soft keys representing different actions such as "Deposit," "Withdrawal," etc. The coordinates of each of the displayed graphical objects comprising selectable soft keys define the coordinates of the corresponding operating regions R1-R7, as indicated on the figure. It is these region coordinates that the comparing and determining unit 52a (FIG. 10) uses to "compare the coordinates (x and y) of the operating point P to the apex coordinates of the operating regions R1 through R7," and to "determine[] whether the operating point P is in the region R0 or one of the operating regions R1 through R7." Ex. 1111 at [0088].
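The comparison performed by comparing and determining unit 52a, which tests the contact coordinates (x and y) against each region's stored apex coordinates (xi-, xi+, yi-, yi+), amounts to a point-in-rectangle test. A minimal sketch of that logic follows; the function name and region bounds are illustrative assumptions, not Tsuji's actual implementation:

```python
def determine_region(x, y, regions):
    """Return the index (1..n) of the operating region containing point
    (x, y), or 0 for the non-operating region R0 outside all of them.

    `regions` is a list of (x_min, x_max, y_min, y_max) tuples, analogous
    to the apex coordinates (xi-, xi+, yi-, yi+) loaded from the region
    classification storage unit."""
    for i, (x_lo, x_hi, y_lo, y_hi) in enumerate(regions, start=1):
        if x_lo <= x <= x_hi and y_lo <= y <= y_hi:
            return i            # contact inside operating region Ri
    return 0                    # contact in non-operating region R0
```

A result of 0 corresponds to region R0; a nonzero result identifies the operating region, and thus the displayed soft key, being pressed.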

495. Tsuji describes the operation with respect to Figure 4 as follows:

While a variety of information can be displayed on the liquid crystal display panel 20, the example in FIG. 4 illustrates menus of a bank automatic cashier. Regions R1 through R7 displayed by these menus serve as operating regions for bank users. For example, when a bank user presses on region R1 displaying "Deposit" with his or her finger using at least a predetermined amount of force, the information display device 100, through an operation described below, senses that "Deposit" has been selected, notifies the bank host computer of this, and then assumes a state in which cash can be accepted. Furthermore, at the same time, the information display surface 21 changes to a screen displaying guidance and a new operating menu for accepting the cash. Note that the size and placement of these operating regions R1 through R7 can be set randomly. Furthermore, region R0 in FIG. 4 illustrates a region in the information display surface 21 that is outside the operating regions R1 through R7.

Ex. 1111 at [0047] (emphasis added).

Tsuji also discloses that an interaction is determined when a user touches, but does not select by pressing, a region, such that the region changes color. Ex. 1111 at [0107]. And Tsuji discloses that the regions can be remapped into a numeric keypad, for example, if the user selects the "Withdrawal" button. Ex. 1111 at [0120].

This same correspondence between operating regions R1-R7 and graphical objects displayed on the liquid crystal display 20 as selectable soft keys is described with respect to the hand-held device embodiment depicted in Figures 17 and 18 (the PDA embodiment). These figures, along with the accompanying description of the touchscreen, are reproduced below:

Ex. 1111 at Figure 17. Ex. 1111 at Figure 18.

FIG. 17 is a perspective drawing of the exterior of an information display device 200 according to a third embodiment of the present invention.

Ex. 1111 at [0040].

The operating regions R1 through R4 displayed by the liquid crystal display panel are visible through the operating surface 11 in FIG. 17. Typically, these operating regions R1 through R4 are displayed along both sides of the operating surface. An operator grips the housing 201 by both sides, as illustrated by the broken lines in FIG. 17, and performs operations by pressing these operating regions R1 through R4 with his/her thumbs. When the position of this pressing operation is sensed, if the pressing force thereof is larger than a predetermined threshold, the operation input is accepted, a displayed object 210 (FIG. 18) on a screen changes, and the operating surface 11 is vibrated or slightly displaced based on a predetermined mode. The operation at this point is the same as in the first and second embodiments.

Ex. 1111 at [0146].
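The press-handling sequence quoted above, in which an input is accepted only when the sensed pressing force exceeds a predetermined threshold, after which the displayed object changes and the operating surface is vibrated, can be sketched as a simple event handler. The function names and callback interface below are illustrative assumptions, not Tsuji's disclosure:

```python
def handle_press(position, force, threshold, update_display, vibrate):
    """Illustrative handler for the sequence Tsuji describes at [0146]:
    a press is accepted as an operation input only when its force exceeds
    the threshold, after which the displayed object changes and the
    operating surface is driven in a predetermined vibration mode.

    `update_display` and `vibrate` are assumed callbacks standing in for
    the display update and the piezoelectric drive, respectively."""
    if force <= threshold:
        return False                 # press too light: input not accepted
    update_display(position)         # e.g. change displayed object 210
    vibrate()                        # vibrate/displace the operating surface
    return True
```

The handler returns whether the operation input was accepted, mirroring the threshold test described in the quoted passage.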

498. As can be observed in these figures and is described by Tsuji above, the operating regions R1-R4 shown in Figure 17 each have a shape and screen location that corresponds directly to the displayed graphical objects comprising selectable soft keys that are shown in Figure 18. Thus, when the processor, comprising in part comparing and determining unit 52a, compares the coordinates of the operating point P indicating the contact point to the apex coordinates of the operating regions to determine whether the operating point P is in the region R0 or is contacting one of the operating regions, this function discloses a processor configured to determine an interaction between the object contacting the touch-sensitive surface and the graphical object displayed on liquid crystal display 20 of Tsuji.

Thus, for at least the reasons directed to a positional interaction as set forth above, Tsuji discloses a processor configured to determine an interaction between the object contacting the touch-sensitive surface and the graphical object.

In the Volume Controller embodiment in Figures 20-22, Tsuji discloses determining an interaction (selection and adjustment of sliders 301 and 305 and selection of buttons 304H and 304L) between the object (user's finger 303) contacting the touch-sensitive surface (Volume Controller touchscreen) and the graphical object (sliders 301, 305, buttons 304H and 304L).

In the first variation of the Volume Controller embodiment, as shown in Figure 20 reproduced below, Tsuji discloses that the processor determines an interaction between the finger 303 and slider 301 based on the position of force applied directly to the slider 301 itself.

Specifically, the processor determines whether the finger 303 contacts slider 301 with sufficient pressing force F and, if so, the slider will be moved with the finger. The processor will determine an interaction (adjustment) to the Y coordinate of the slider on track 302. Tsuji at [ ]. The Y coordinate could be determined in terms of a continuous function or as discrete levels such as YH, YM, YL. Tsuji at [0154].

In the second variation of the Volume Controller embodiment, as shown in Figure 21 reproduced below, Tsuji discloses that the processor determines an interaction between the finger 303 and slider 305 and buttons 304H and 304L.
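The slider behavior described above, tracking the finger's Y position along the track when the pressing force is sufficient, either as a continuous coordinate or snapped to discrete levels such as YH, YM, and YL, can be sketched as follows. The threshold, track extent, and level values are illustrative assumptions, not values from Tsuji:

```python
def adjust_slider(touch_y, force, threshold, track, levels=None):
    """Return the new slider Y position, or None if the pressing force is
    below the acceptance threshold.

    `track` is the (y_min, y_max) extent of the adjusting line (302);
    `levels` optionally lists discrete positions (e.g. YL, YM, YH) to snap
    to, corresponding to Tsuji's discrete-level variation."""
    if force < threshold:
        return None                          # press too light: no adjustment
    y_min, y_max = track
    y = max(y_min, min(y_max, touch_y))      # clamp position to the track
    if levels:                               # discrete mode: snap to nearest
        y = min(levels, key=lambda lv: abs(lv - y))
    return y
```

With `levels=None` the slider follows the finger continuously; with a level list it snaps to the nearest discrete setting.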

Specifically, the processor determines whether the finger 303 contacts the touch screen at either push button display 304L (lower volume) or push button display 304H (raise volume) with sufficient pressure. Ex. 1111, [0158]. If so, Tsuji discloses that the processor determines an interaction (adjustment) of that contact with slider 305 by determining the Y coordinate of slide display 305 and adjusting it along adjusting line 302 in accordance with the finger 303's contact of either the raise or lower button. Ex. 1111, Figs. 21, 22, [0155]-[0158]. As in the first variation of the Volume Controller, the interaction with the Y position can be determined by the processor in continuous coordinates (YD) or discrete levels (YH, YM, YL). Ex. 1111, [ ].

As discussed above with respect to the positional interaction, Tsuji discloses significant implementation details regarding how the processor is configured to make the above determinations based on the position and force of the contact, in far more detail than the '356 patent itself. Both Tsuji and the '356 patent disclose touchscreens comprised of two layers: a display underneath a transparent touchpad. Tsuji at Fig. 3; '356 patent at Fig. 6. One of ordinary skill understands that the graphical object is displayed by output signals sent to the display portion, and the contacts result in input signals from the touchpad portion. The processor then determines whether the input signals correlate sufficiently with the displayed graphical objects to infer a selection of a button or other intended functionality of the user interface device.

As I will discuss next, Tsuji also discloses this limitation in the context of determining a force interaction between the object contacting the touch-sensitive surface and a graphical object.

iii. Tsuji discloses determining a force-level interaction between the object contacting the touch-sensitive surface and the graphical object

508. When an object such as a finger contacts the touch-sensitive surface at a point inside of an operating region R corresponding to a graphical object as described above, the processor comprising control circuit CT additionally determines and classifies the level of force being applied to the region of the displayed graphical object.

This is described with respect to the first embodiment in Tsuji at, for example, paragraphs [0092]-[0104], in a section entitled "Determination of the Operating Force," as well as throughout the reference, as discussed below.

Tsuji describes this function in the following passage:

On the other hand, in FIG. 7, an operating force signal SF indicating the operating force F is applied to an operating force determining unit 54. A plurality of thresholds Fh1 through Fh4 for defining the operating force classifications F0 through F4 in FIG. 11 are input from an operating force classification storage unit 55 into the operating force determining unit 54. The information of these thresholds Fh1 through Fh4 is also loaded at a given time from an information processing unit 60 described below, based on displayed content. Furthermore, while operating force classifications F0 through F4 have been defined for this example, the number of force classifications can be changed based on the displayed content at a given time.

Ex. 1111 at [0092] (emphasis added).

In summary, the procedures described at [0092]-[0104] utilize the individual pressing force signals fk from each of the piezoelectric elements E1-E4 to calculate an operating force signal SF:

Moreover, a case is imagined where the operating panel 10M is pressed downward at a position at point P (x and y) with a pressing force F. At this time, the principle for sensing (x and y), which are the XY coordinates of point P, using the function of the piezoelectric elements E1 through En, is as described below.

Ex. 1111 at [0051] (emphasis added).

As a result, an operating position signal SP showing the operating position P (x and y) and an operating force signal SF showing the operating force F are both output from the operation unit 51. The operating position signal SP has two components (x and y).

Ex. 1111 at [0086] (emphasis added).

As I described previously with respect to contact position determination in Tsuji, which I incorporate here by reference, the operation unit 51 of Figure 7 and the operating position specification unit 51T of Figure 16 provide the XY coordinates of the point of contact P at which an object such as a finger contacts the display surface in the first and second embodiments. This operating point comprises the point of contact in XY coordinates on the operating surface 10, and is represented by a signal SP output by these units.

As discussed above, the operation units 51 and 51F of Figures 7 and 16 also output an operating force signal SF, which is provided as an input to an operating force determining unit 54. These aspects of the control circuit for each of the embodiments corresponding to Figures 7 and 16 are shown with annotation, below:

Ex. 1111 at Figure 7, annotation added.

Ex. 1111 at Figure 16, annotation added.
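The signal flow annotated in these figures, in which the corner piezoelectric signals are combined into an operating position signal SP and an operating force signal SF, can be illustrated with a simple moment-balance sketch. Tsuji's actual formulas ("Number 16" and "Number 17") are not reproduced in this declaration, so the corner geometry and equations below are assumptions for illustration only:

```python
def position_and_force(f, a, b):
    """Illustrative reconstruction of the operation unit's outputs: the
    operating force signal SF (total force) and operating position signal
    SP (x, y), from four corner forces f = (f1, f2, f3, f4).

    Assumed corner layout (not Tsuji's actual geometry or formulas):
    E1 at (0, 0), E2 at (a, 0), E3 at (a, b), E4 at (0, b)."""
    f1, f2, f3, f4 = f
    SF = f1 + f2 + f3 + f4              # total operating force F
    if SF <= 0:
        return None, 0.0                # no press: no position signal
    x = a * (f2 + f3) / SF              # moment balance about the left edge
    y = b * (f3 + f4) / SF              # moment balance about the bottom edge
    return (x, y), SF
```

A centered press with equal corner forces yields the panel midpoint, while an uneven force distribution shifts the computed contact point toward the more heavily loaded corners.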

514. The operating force determining unit 54 accepts as inputs both the operating force signal SF (indicating the magnitude of the force) and a signal from the operating force classification storage unit 55 defining force thresholds Fh1-Fh4, and outputs the result of this determination as an operating force determining signal, FB:

Furthermore, a signal is output from each determining unit for the effective operating force classifications F1 through F4 to a drive mode selecting unit 72 in FIG. 7, as an operating force determining signal FB. This signal is used as information for selecting the operating surface 11 drive mode using the piezoelectric elements E1 through E4, based on what classification the operating force F is in.

Ex. 1111 at [0103] (emphasis added).

The individual operating force determining units 54a within the operating force determining unit 54 are depicted in more detail in Figure 12, which is reproduced below:

Ex. 1111 at Figure 12.

One function of this unit is to generate the operating force determining signal FB that indicates the level of applied force relative to the defined force thresholds Fh1-Fh4. This function is described further in the following passage:

On the other hand, in FIG. 7, an operating force signal SF indicating the operating force F is applied to an operating force determining unit 54. A plurality of thresholds Fh1 through Fh4 for defining the operating force classifications F0 through F4 in FIG. 11 are input from an operating force classification storage unit 55 into the operating force determining unit 54. The information of these thresholds Fh1 through Fh4 is also loaded at a given time from an information processing unit 60 described below, based on displayed content. Furthermore, while operating force classifications F0 through F4 have been defined for this example, the number of force classifications can be changed based on the displayed content at a given time.

Ex. 1111 at [0092].

The relative relationship of the force thresholds Fh1 to Fh4 and force classifications F0-F4 as referenced above is depicted in Figure 11, reproduced below:

Ex. 1111 at Figure 11.

As can be observed in this figure, force classifications F0 through F4 are determined based on where the level of the applied force (as represented by the operating force signal SF) falls with respect to the defined force thresholds, Fh1-Fh4. Thus, for example, if the operating force signal SF indicated a magnitude that is greater than Fh1 and less than Fh2, then the operating force determining signal FB would indicate a force classification of F1.

This is described further in Tsuji below:

With the four operating force classifications F1 through F4, in ranges at or above the minimum threshold Fh1, referred to as effective operating force classifications, the operating force determining unit 54 uses the comparing and determining unit 54a housed therein (FIG. 12) to compare the operating force F, denoted at a time given by the operating force signal SF, to the operating force thresholds Fh1 through Fh4 to determine which of the effective operating force classifications F1 through F4 the operating force F is in at a given time.

Ex. 1111 at [0095].

For example, if [Number 19] is Fh1 ≤ F < Fh2, the operating force is determined to be pressing at effective operating force classification F1, and if [Number 20] is Fh4 ≤ F, the operating force is determined to be pressing at effective operating force classification F4.

Ex. 1111 at [ ].

In addition, I note that the force thresholds Fh1-Fh4 can be changed based on the region in which the point of contact occurs. Thus, for example, different force classifications can be determined as a function of which region, corresponding to a graphical object comprising a selectable soft key, is being pressed:

Finally, the thresholds Fh1 through Fh4 can be changed based on which region the operating position P belongs to at a given time (hereinafter referred to as "in-operation region R"). Accordingly, for example, the values of the thresholds Fh1 through Fh4 can be made small for the operating regions R1 through R6, and the values of the thresholds Fh1 through Fh4 can be made large for the operating region R7. These corresponding relationships are stored in advance in the information processing unit 60 in FIG. 7, and the specific method for changing the thresholds is described below.

Ex. 1111 at [0093] (emphasis added).

Thus, as shown above, the embodiments of the touch-sensitive input device disclosed in Tsuji each determine the level of force being applied by an object such as a finger when contacting the touch-sensitive surface in a region corresponding to a displayed graphical object, such as a selectable soft key. This information is used to determine a force classification F0 to Fn, where F0 represents a non-active level response (i.e., it will not generate any effect on the display and/or a generated haptic effect), and force classifications of F1-Fn represent progressively larger levels of force. This force classification, when generated by an object such as a finger contacting the region corresponding to a displayed graphical object such as a soft key, therefore discloses another example of "determining an interaction..." as set forth in limitation 12f.

Thus, for at least these additional reasons directed to a force-level interaction as set forth above, Tsuji discloses a processor configured to determine an interaction between the object contacting the touch-sensitive surface and the graphical object.

Limitation 12.g: generate the actuator signal based at least in part on the interaction and haptic effect data in a lookup table; and

523. Tsuji discloses and/or renders obvious a processor configured to generate the actuator signal based at least in part on the interaction and haptic effect data in a lookup table.

524. As I have established above, Tsuji discloses a processor configured to determine an interaction between the object contacting the touch-sensitive surface and the graphical object. In particular, as part of my analyses directed to claim element 12f, which I incorporate here by reference, I showed that Tsuji discloses determining both a positional and a force-level interaction between the object contacting the touch-sensitive surface and the graphical object.
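The threshold-based force classification recapped above (classes F0 through F4 determined against ascending thresholds Fh1 through Fh4, per Tsuji's Figure 11) can be sketched directly; the threshold values used in the test are illustrative assumptions:

```python
def classify_force(F, thresholds):
    """Classify operating force F against ascending thresholds (Fh1..Fh4),
    returning 0 for the inactive class F0 (F < Fh1), or k such that
    Fhk <= F < Fh(k+1), matching the classification shown in Figure 11.
    The top class is open-ended: any F at or above the last threshold
    falls in the highest classification."""
    k = 0
    for fh in thresholds:       # thresholds must be sorted ascending
        if F >= fh:
            k += 1
        else:
            break
    return k
```

The returned class index plays the role of the operating force determining signal FB, which downstream logic uses to select a drive mode.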

238 525. As I described above, the positional determination of interaction is indicated by the value of the region determining signal SR, which is an output of region determining unit 52 shown in Figures 7 and 16. In the embodiments described, this may assume a value of R0, indicating contact outside of any active operating region, or one of a value from R1 to R7, for example, indicating that the object is contacting the surface at a location that is within a region R1-7 corresponding to the area of a displayed graphical object. 526. As I further described above, the force-level determination of interaction is indicated by the value of the force determining signal FB, which is an output of operating force determining unit 54, as shown in Figures 7 and 16. In the embodiments described, the operating force signal FB may assume a force classification value of F0, indicating a level of force below an active level, or one of a force classification value from F1 to F4. Force classification values which occur in the range of F1-F4 when an object is contacting the surface at a location that is within a region R1-7 corresponding to the area of a displayed graphical object indicate that the object is pressing with a force sufficient to be active (i.e. greater than F0). Additionally, the level of force is further classified as indicated by the force classification value F1-F4, with the level of force indicated as progressively increasing with values F1 to F4. 527. As I will describe in detail below, Tsuji discloses that the processor WEST\ EXHIBIT PAGE 238

239 will generate the actuator signal based at least in part on the interaction, and further, that Tsuji discloses and/or renders obvious that the processor will generate the actuator signal based at least in part on haptic effect data in a lookup table. I will address each of these in turn, below. i. Tsuji discloses that the processor is configured to generate the actuator signal based at least in part on the interaction As I showed above with respect to Figures 7 and 16, the piezoelectric actuators are configured to receive an actuator signal delivered via drive unit 75. Tsuji describes how this actuator signal causes the piezoelectric actuator elements E1-E4 to generate a haptic effect, such as a vibration: High frequency is applied to the piezoelectric elements E1 through E4 when an operating force larger than a predetermined threshold is sensed, which thus causes the operating surface 11 to vibrate. This vibration allows an operator to obtain a reliable sense of operation. The number of parts is small because the sensing of the operating force applied to the operating surface and the application of the vibration to the operating surface 11 are performed using the common piezoelectric elements E1 through E4. Ex at Abstract, (emphasis added) Tsuji also discloses that in addition to an actuator signal generating a vibrational haptic effect, other actuator signals producing additional haptic effects may be provided, such as pulsed displacement: In FIG. 7, the drive mode parameter signal V output from the drive mode selecting unit 72 is applied to the piezoelectric element drive unit 75. The piezoelectric element drive unit 75 has a high frequency WEST\ EXHIBIT PAGE 239

240 oscillation circuit 76, which transmits a high frequency wave specified by the parameter signal V to the piezoelectric elements E1 through E4. Thus, the piezoelectric elements E1 through E4 are vibrated or slightly displaced at a specified amplitude and timing. Ex at [0122], (emphasis added). This is felt by the user because it causes the entire touch panel to vibrate. Ex at [0122], (emphasis added). When one drive mode is selected using the region determining signal SR and the operating force determining signal FB in this way, the parameter values that specify drive modes in FIG. 14 are read from the drive mode storage unit 73, and then applied to a piezoelectric element drive unit 75 in FIG. 7. In response to this, a vibration voltage is applied to the piezoelectric elements E1 through E4 such that the piezoelectric elements E1 through E4 vibrate or are displaced slightly, and this vibration or slight displacement is propagated to the operating surface 11. When the operator presses down on one of the operating regions R1 through R7 with at least a predetermined force, a tactile action intended to notify the operator that the operation was received is generated by a vibration or slight sliding of the operating surface 11. Ex at [0113], (emphasis added) Tsuji discloses that a variety of different drive modes may be defined, stored, and utilized to generate an actuator signal that is applied to the piezoelectric actuator elements E1-E4. Figure 14 of Tsuji depicts examples of actuator signal waveforms corresponding to different drive modes that may be stored in Drive mode storage 73 and utilized to drive the actuators, and is reproduced below: WEST\ EXHIBIT PAGE 240

241 Ex at Figure 14. Examples of actuator signals disclosed by Tsuji and depicted in Figure 14 include continuous vibrations of selectable amplitudes and frequencies, as well as short bursts of vibrations and discrete pulses. This is described by Tsuji as follows: FIG. 14 schematically illustrates a variety of drive modes stored in a drive mode storage unit 73. For example, FIG. 14(a) illustrates a mode for performing a continuous vibration having a small amplitude, while FIG. 14(b) is a vibration mode having a large amplitude. FIG. 14(c) illustrates a vibration mode having a different frequency than FIG. 14(a) and (b), while FIG. 14(d) and (e) illustrate examples where vibration is performed for a short period of time once or twice. Furthermore, FIG. 14(f) is a vibration mode for applying only a single WEST\ EXHIBIT PAGE 241

242 vibration (one-shot pulse). Note that examples of other modes are described below. Ex at [0110] These and other drive modes are used to provide an actuator signal to the piezoelectric actuator elements which cause a corresponding haptic effect to be output to the touch-sensitive surface, and are selected based in part on the interaction. Tsuji discloses this, for example, in conjunction with the ATM embodiment in Figure 4, which is shown below: Ex at Figure 4. This figure depicts a screen that might be displayed in conjunction with a banking application, and includes seven displayed graphical objects comprising soft keys representing different actions such as Deposit, WEST\ EXHIBIT PAGE 242

243 Withdrawal, etc. The coordinates of each of the displayed graphical objects comprising selectable soft keys define the coordinates of the corresponding operating regions R1-R7, as indicated on the figure. It is these region coordinates that the comparing and determining unit 52a (FIG. 10) uses to compare the coordinates (x and y) of the operating point P to the apex coordinates of the operating regions R1 through R7, and to determine[] whether the operating point P is in the region R0 or one of the operating regions R1 through R7. Ex at [0088] Tsuji discloses that an actuator signal is generated at least in part by this determination of both a positional and a force-level interaction between the point of contact of the object, and the region corresponding to the displayed graphical object. Tsuji describes the operation in the first embodiment with respect to Figure 4 as follows: While a variety of information can be displayed on the liquid crystal display panel 20, the example in FIG. 4 illustrates menus of a bank automatic cashier. Regions R1 through R7 displayed by these menus serve as operating regions for bank users. For example, when a bank user presses on region R1 displaying Deposit with his or her finger using at least a predetermined amount of force, the information display device 100, through an operation described below, senses that Deposit has been selected, notifies the bank host computer of this, and then assumes a state in which cash can be accepted. Furthermore, at the same time, the information display surface 21 changes to a screen displaying guidance and a new operating menu for accepting the cash. Note that the size and placement of these operating regions R1 through R7 can be set randomly. Furthermore, region R0 in FIG. 4 WEST\ EXHIBIT PAGE 243

244 illustrates a region in the information display surface 21 that is outside the operating regions R1 through R7. Ex at [0047], (emphasis added). Moreover, the piezoelectric elements E1 through E4 in FIG. 3 are used in the device of the first embodiment as elements combining both sensing means for sensing whether a bank user has pressed any of the operating regions R1 through R7, and driving means for gently vibrating the operating panel 10 based on said pressing. Ex at [0048], (emphasis added) Tsuji discloses that in the ATM embodiment, when the regions are used to display a numerical keyboard, one-shot haptic effects are generated based on the user interacting with (selecting) one of the numerical buttons. Ex at [0120] Tsuji also discloses in the hand-held device embodiment (the PDA embodiment) that an actuator signal is generated at least in part by the determination of both a positional and a force-level interaction between the point of contact of the object, and the displayed graphical object. This generation of an actuator signal based on the interaction of a user's fingers contacting the touch screen and graphical objects displayed on the liquid crystal display 20 as selectable soft keys is described with respect to the hand-held device embodiment depicted in Figures 17 and 18. These figures, along with accompanying description of the touchscreen, are reproduced below: WEST\ EXHIBIT PAGE 244

245 Ex at Figures 17 & 18. FIG. 17 is a perspective drawing of the exterior of an information display device 200 according to a third embodiment of the present invention. Ex at [0040]. The operating regions R1 through R4 displayed by the liquid crystal display panel are visible through the operating surface 11 in FIG. 17. Typically, these operating regions R1 through R4 are displayed along both sides of the operating surface. An operator grips the housing 201 by both sides, as illustrated by the broken lines in FIG. 17, and performs operations by pressing these operating regions R1 through R4 with his/her thumbs. When the position of this pressing operation is sensed, if the pressing force thereof is larger than a predetermined threshold, the operation input is accepted, a displayed object 210 (FIG. 18) on a screen changes, and the operating surface 11 is vibrated or slightly displaced based on a predetermined mode. The operation at this point is the same as in the first and second embodiments. Ex at [0146], (emphasis added) As discussed above, in the Volume Control embodiment the WEST\ EXHIBIT PAGE 245

246 interaction additionally includes the Y coordinate of the slider. Figs. 20 (slider 301); 21 (slider 305). As established above, the determination of the Y coordinate can be in continuous or discrete values (e.g., YH, YM, YL). Ex. 1111, [ ]. Thus, in the Volume Control embodiment, Tsuji discloses that the lookup table 72 has three indices Rn (graphical object contacted), Fn (force of contact) and Yd (Y coordinate) that are used to look up the associated haptic effect data (Snn). This is shown by Tsuji in Figure 22, reproduced below, showing the three inputs R, FC, YD and output V. Tsuji discloses Figure 22 is a modification relevant to both the first and second variations of the control circuit CT disclosed in Figures 7 and 16. Ex. 1111, [0155]. Tsuji discloses that the relationship, i.e., the association between this interaction and the haptic effect data is stored in the table. Ex. 1111, WEST\ EXHIBIT PAGE 246

247 [0155] ( The table 72a in the drive mode selecting unit 72 stores relationships for selecting a drive mode to coincide with this Y coordinate identifying value YD in a table format.... ). See also, Ex. 1111, [0155]-[0158] Tsuji provides additional disclosure of how the drive mode that produces the actuator signal is selected based on position and force-level interactions at paragraphs [0108]-[0126] under a section entitled, Drive Mode Selection. Tsuji explains that the drive mode defines the haptic effect that will be generated, and that this drive mode is based on the classifications of the in-operation region and the operating force, i.e. based on the determined interactions: On the other hand, the drive mode selecting unit 72 in FIG. 7, which inputs the region determining signal SR and an operating force determining signal FB, selects a drive mode based on the classifications of the in-operation region and the operating force. The drive mode defines how the operating surface 11 will be vibrated. Ex at [0108] Thus, for at least the reasons as set forth above, Tsuji discloses a processor configured to generate the actuator signal based at least in part on the interaction. ii. Tsuji discloses that the processor is configured to generate the actuator signal based at least in part on the interaction stored in a lookup table Tsuji discloses that the interaction is stored in a lookup table, and that WEST\ EXHIBIT PAGE 247

248 the actuator signal is based in part on a determined interaction stored in the lookup table. Tsuji discloses a lookup table as shown in Figure 13, which is reproduced below: Ex at Figure 13. As can be observed in this figure, a data structure comprising a table 72a is disclosed which utilizes both operating force classifications F1-F4, and region determinations R0, R1-R6 as first and second indices to the table. As described above, for the Volume Controller embodiment, Tsuji discloses that lookup table 72a additionally stores the Y coordinates (e.g., YH, YM, YL) as a third index to the table. Ex at Fig. 22. A person of ordinary skill in the art would understand table 72a that WEST\ EXHIBIT PAGE 248

249 holds data that is accessed using a lookup procedure and first and second indices to the table is a lookup table. For a given lookup procedure, the operating force value associated with an interaction is specified by the operating force determining signal FB, and is used as a first index to the table. The region value R associated with the interaction is specified by the region determining signal SR, and is used as a second index to the table. Based on the intersection of the column and row specified by region R and associated force F of the interaction, the signals S11, S12, etc. are used to generate the drive mode, which in turn defines the actuator signal. This is described by Tsuji as follows: Specifically, as is illustrated in FIG. 13, with which region, R1 through R0, the region determining signal SR belongs to as a first index, and which of the operating force classifications, F1 through F4, a classification expressed by the operating force determining signal FB belongs to as a second index, the drive modes that should be selected for combinations of the first and second indices are stored in Table 72a in advance. Signals S11, S12,... in FIG. 13 are codes for selecting and defining any of a variety of drive modes like, for example, that illustrated in FIG. 14. Ex at [0109], (emphasis added) As discussed previously, [t]he drive mode defines how the operating surface 11 will be vibrated, [0108], i.e. it defines the actuator signal to be applied to the piezoelectric actuators. Figure 14 of Tsuji depicts examples of actuator WEST\ EXHIBIT PAGE 249

250 signal waveforms corresponding to different drive modes that may be stored in Drive mode storage 73 and utilized to drive the actuators, and is reproduced below: Ex at Figure 14. Tsuji explains that the drive modes can be defined by specifying the parameters for frequency (VF), amplitude (VT), and duration (VD) of the actuator signal, as depicted in Figure 14(d) above. Ex at [0111] 550. Tsuji further explains that the lookup table procedure using the operating force determining signal FB (specifying the force index F) and the region determining value SR (specifying the region index R) is used to select the parameter for the drive mode which is used to generate the actuator signal: WEST\ EXHIBIT PAGE 250

251 When one drive mode is selected using the region determining signal SR and the operating force determining signal FB in this way, the parameter values that specify drive modes in FIG. 14 are read from the drive mode storage unit 73, and then applied to a piezoelectric element drive unit 75 in FIG. 7. In response to this, a vibration voltage is applied to the piezoelectric elements E1 through E4 such that the piezoelectric elements E1 through E4 vibrate or are displaced slightly, and this vibration or slight displacement is propagated to the operating surface 11. When the operator presses down on one of the operating regions R1 through R7 with at least a predetermined force, a tactile action intended to notify the operator that the operation was received is generated by a vibration or slight sliding of the operating surface 11. Ex at [0113] Similarly, a person of ordinary skill in the art would understand table 72a within drive mode selecting unit 72 as shown in Figure 22 which holds data that is accessed using a lookup procedure using first, second and third indices to the table is also a lookup table. Tsuji discloses that the relationship, i.e., the association between this interaction and the haptic effect data is stored in the table 72a. Ex. 1111, [0155] ( The table 72a in the drive mode selecting unit 72 stores relationships for selecting a drive mode to coincide with this Y coordinate identifying value YD in a table format.... ). See also, Ex. 1111, [0155]-[0158] Thus, as shown above, the processor is configured to generate the actuator signal based at least in part on the interaction... stored in a lookup table. WEST\ EXHIBIT PAGE 251
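The two-stage selection described above, in which table 72a maps a (region, force classification) interaction to a drive mode code (S11, S12, ...) and drive mode storage 73 maps that code to the parameter values defining the actuator signal, can be sketched in code. The following is an illustrative sketch only, not code from Tsuji; the table contents, parameter values, and all names are assumptions chosen purely for illustration.

```python
# Illustrative sketch (not Tsuji's code) of the two-stage lookup described
# above: table 72a maps a (region, force classification) interaction to a
# drive mode code, and drive mode storage 73 maps that code to assumed
# parameter values (frequency, amplitude, duration) defining the actuator
# signal. All contents are assumptions for illustration only.

TABLE_72A = {                      # indexed by (region SR, force FB)
    ("R1", "F1"): "S11",
    ("R1", "F2"): "S12",
    ("R2", "F1"): "S21",
}

DRIVE_MODE_STORAGE_73 = {          # code -> (frequency Hz, amplitude, duration ms)
    "S11": (200.0, 0.5, 30),       # small-amplitude continuous vibration
    "S12": (200.0, 1.0, 30),       # larger amplitude, cf. FIG. 14(b)
    "S21": (300.0, 0.5, 10),       # short one-shot burst, cf. FIG. 14(f)
}

def select_actuator_signal(sr: str, fb: str):
    """Look up the drive mode for an interaction and return its parameters."""
    if fb == "F0":
        return None                # inactive force level: no drive mode output
    code = TABLE_72A[(sr, fb)]     # first and second indices to table 72a
    return DRIVE_MODE_STORAGE_73[code]
```

The sketch mirrors the disclosed behavior that no drive mode information is output when the operating force determining signal is not active, and that the retrieved parameter values are what specify the drive mode applied to the piezoelectric element drive unit.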

252 iii. Tsuji discloses and/or renders obvious a processor that is configured to generate the actuator signal based at least in part on the interaction and haptic effect data stored in a lookup table As I described above, Tsuji discloses a lookup table as shown in Figure 13, which is reproduced below: Ex at Figure 13. As can be observed in this figure, a data structure comprising a table 72a is disclosed which utilizes both operating force classifications F1-F4, and region determinations R0, R1-R6 as first and second indices to the table. As described above, for the Volume Controller embodiment, Tsuji discloses that WEST\ EXHIBIT PAGE 252

253 lookup table 72a additionally stores the Y coordinates (e.g., YH, YM, YL) as a third index to the table. Ex at Fig. 22. A person of ordinary skill in the art would understand table 72a that holds data that is accessed using a lookup procedure and one or more indices to the table is a lookup table. For a given lookup procedure, the operating force value associated with an interaction is specified by the operating force determining signal FB, and is used as a first index to the table. The region value R associated with the interaction is specified by the region determining signal SR, and is used as a second index to the table. Based on the intersection of the column and row specified by region R and associated force F of the interaction, the signals S11, S12, etc. are used to generate the drive mode, which in turn defines the actuator signal. This is described by Tsuji as follows: Specifically, as is illustrated in FIG. 13, with which region, R1 through R0, the region determining signal SR belongs to as a first index, and which of the operating force classifications, F1 through F4, a classification expressed by the operating force determining signal FB belongs to as a second index, the drive modes that should be selected for combinations of the first and second indices are stored in Table 72a in advance. Signals S11, S12,... in FIG. 13 are codes for selecting and defining any of a variety of drive modes like, for example, that illustrated in FIG. 14. Ex at [0109], (emphasis added) As established above, in the Volume Controller embodiment, the WEST\ EXHIBIT PAGE 253

254 lookup table 72a is modified to additionally contain associations between the Y coordinate interaction and the haptic effect data. Ex. 1111, [0155] ( The table 72a in the drive mode selecting unit 72 stores relationships for selecting a drive mode to coincide with this Y coordinate identifying value YD in a table format.... ) (emphasis added). See also, Ex. 1111, [0155]-[0158] As described by Tsuji above, signals S11, S12, etc. are stored in the lookup table, and these signals are codes for selecting and defining any of a variety of drive modes like, for example, that illustrated in FIG. 14. Thus, the signals identify and define the haptic effect. Tsuji also describes signals S11, S12 that are stored in the lookup table 72a as parameter codes which identify the different drive modes having vibration frequency (VF), vibration amplitude (VD), and vibration time (VT) parameters. Ex at [0111] Tsuji discloses that once a drive mode is selected from the lookup table using the region determining signal SR and the operating force determining signal FB as indices to the table, the parameter code stored in lookup table 72a corresponding to these indices is used to retrieve the corresponding parameter values: When one drive mode is selected using the region determining signal SR and the operating force determining signal FB in this way, the parameter values that specify drive modes in FIG. 14 are read from the drive mode storage unit 73, and then applied to a piezoelectric WEST\ EXHIBIT PAGE 254

255 element drive unit 75 in FIG. 7. In response to this, a vibration voltage is applied to the piezoelectric elements E1 through E4 such that the piezoelectric elements E1 through E4 vibrate or are displaced slightly, and this vibration or slight displacement is propagated to the operating surface 11. When the operator presses down on one of the operating regions R1 through R7 with at least a predetermined force, a tactile action intended to notify the operator that the operation was received is generated by a vibration or slight sliding of the operating surface 11. Ex at [0113] The drive mode storage unit 73 from which the parameter values that specify the drive modes are read is shown in Figure 7, indicated by the red box: Ex at Figure 7, with annotation added. Thus, as established above, Tsuji discloses a data structure in the form of a table (table 72a) containing associations between interactions (Rn and Fn (and WEST\ EXHIBIT PAGE 255

256 Yd for Volume Controller embodiments)) and haptic effect data (Snn). Limitation 12.h: transmit the actuator signal to the actuator Tsuji discloses a processor configured to transmit the actuator signal to the actuator As I showed above in my analysis directed to claim element 12c, which I incorporate here by reference, Tsuji discloses a processor comprising control circuit CT, as shown in Figure 7 and 16. As I established above with respect to Figures 7 and 16, the piezoelectric actuators are configured to receive an actuator signal delivered via drive unit 75. Tsuji describes how this actuator signal is transmitted to the actuator and causes the piezoelectric actuator elements E1-E4 to generate a haptic effect, such as a vibration: High frequency is applied to the piezoelectric elements E1 through E4 when an operating force larger than a predetermined threshold is sensed, which thus causes the operating surface 11 to vibrate. This vibration allows an operator to obtain a reliable sense of operation. The number of parts is small because the sensing of the operating force applied to the operating surface and the application of the vibration to the operating surface 11 are performed using the common piezoelectric elements E1 through E4. Ex at Abstract, (emphasis added) Tsuji also discloses that actuator signals of various types, representing the different drive modes discussed above, are transmitted to the actuator to produce the haptic effects: WEST\ EXHIBIT PAGE 256

257 In FIG. 7, the drive mode parameter signal V output from the drive mode selecting unit 72 is applied to the piezoelectric element drive unit 75. The piezoelectric element drive unit 75 has a high frequency oscillation circuit 76, which transmits a high frequency wave specified by the parameter signal V to the piezoelectric elements E1 through E4. Thus, the piezoelectric elements E1 through E4 are vibrated or slightly displaced at a specified amplitude and timing. Ex at [0122], (emphasis added). This is felt by the user because it causes the entire touch panel to vibrate. Ex at [0123], (emphasis added) Tsuji further describes actuator signals producing both vibration and displacement being transmitted to the actuator: When one drive mode is selected using the region determining signal SR and the operating force determining signal FB in this way, the parameter values that specify drive modes in FIG. 14 are read from the drive mode storage unit 73, and then applied to a piezoelectric element drive unit 75 in FIG. 7. In response to this, a vibration voltage is applied to the piezoelectric elements E1 through E4 such that the piezoelectric elements E1 through E4 vibrate or are displaced slightly, and this vibration or slight displacement is propagated to the operating surface 11. When the operator presses down on one of the operating regions R1 through R7 with at least a predetermined force, a tactile action intended to notify the operator that the operation was received is generated by a vibration or slight sliding of the operating surface 11. Ex at [0113], (emphasis added) Tsuji discloses that a variety of different drive modes may be defined, stored, and utilized to generate an actuator signal that is ultimately transmitted to and applied to the piezoelectric actuator elements E1-E4. Figure 14 of Tsuji WEST\ EXHIBIT PAGE 257

258 depicts examples of actuator signal waveforms corresponding to different drive modes that may be stored in Drive mode storage 73 and utilized to drive the actuators, and is reproduced below: Ex at Figure 14. These and other drive modes are used to generate and transmit an actuator signal to the piezoelectric actuator elements which cause a corresponding haptic effect to be output to the touch-sensitive surface, as further described by Tsuji below: When one drive mode is selected using the region determining signal SR and the operating force determining signal FB in this way, the parameter values that specify drive modes in FIG. 14 are read from WEST\ EXHIBIT PAGE 258

259 the drive mode storage unit 73, and then applied to a piezoelectric element drive unit 75 in FIG. 7. In response to this, a vibration voltage is applied to the piezoelectric elements E1 through E4 such that the piezoelectric elements E1 through E4 vibrate or are displaced slightly, and this vibration or slight displacement is propagated to the operating surface 11. Ex at [0113], (emphasis added) As I have established above, Tsuji discloses that the control circuit CT that may comprise an MPU (which includes drive unit 75) transmits the actuator signal to the piezoelectric elements, as described in Figures 7 and 16, and accompanying text. Additional disclosures demonstrating that the actuator signal that is generated is subsequently transmitted to the actuator to produce the desired haptic effect at the surface are provided below: Furthermore, a mechanical reaction generated by the bi-directional function means through the drive signal is transmitted to the operating surface and is captured as a sense of touch of an operator. Ex [0021]. Furthermore, a mechanical reaction of the bi-directional function means caused by the drive signal is transmitted to the operating surface as a sense of touch of an operator. Ex at [0038]. Through this kind of configuration, while a parameter signal V for a drive mode specified using a corresponding location in the table 72a is output to the piezoelectric element drive unit 75 when the operating force determining signal FB is active, no drive mode information of any kind is output to the piezoelectric element drive unit 75 when the operating force determining signal FB is not active. Therefore, the operating surface 11 is only made to vibrate or displace slightly when WEST\ EXHIBIT PAGE 259

260 an operating force F equal to or greater than the minimum threshold Fh1 is applied to the operating surface 11. Ex at [0115]. In FIG. 7, the drive mode parameter signal V output from the drive mode selecting unit 72 is applied to the piezoelectric element drive unit 75. Ex at [0122], (emphasis added) As further evidence supporting my opinion that Tsuji discloses this limitation, see also: Ex at [0020], [0022], [0037], [0081], [0121], [0126], [ ], [0178]; Figure 13; Claims 1 ( drive control means for sending an electric drive signal to the bi-directional function means in response to the operating signal, ), 2 ( operating signal determining means for comparing the operating signal to a predetermined threshold and sending the drive signal to the bi-directional function means when the operating signal exceeds said threshold ), 3, 15 ( drive signals ). See also the evidence and information cited for claim limitations 12.c and 12.d, which is incorporated here by reference. Claim 13: The system of claim 12, wherein the processor is configured to generate the actuator signal when the object contacts the touch-sensitive input device at a location corresponding to the graphical object Tsuji discloses that the processor is configured to generate the actuator signal when the object contacts the touch-sensitive input device at a location corresponding to the graphical object. I established this in my analysis of WEST\ EXHIBIT PAGE 260

261 claim limitations 12.d and 12.g, which are incorporated here by reference. For example, as established above for claim 12, the control circuit CT comprising, for example, the disclosed MPU software embodiment, is configured to generate the actuator signal when the object contacts the touch panel at a location corresponding to one of graphical objects R1 through R7, as shown, for example, by the input SR to unit 72 in Figure 13. Tsuji's disclosure is very clear that the area of the graphical object is defined and the location of the touch compared to those areas to determine which, if any, of graphical objects R1-R7 is contacted. Tsuji at Figure 9 and accompanying text. In fact, Tsuji discloses determining when no touch is occurring (SP at non-active level) or when contact with the touchscreen occurs, but no graphical object is contacted (R0). Ex at [0087]-[0091]. Each of the various graphical objects R1-R7, and non-graphical object R0, has a specific haptic effect associated with it, which may vary based on the level of force being applied as represented by force signal FB. Ex at Figure 13 and accompanying text. The MPU is configured to output the appropriate actuator signal (V) when the touch panel is touched at a location corresponding to any of graphical objects R1-R7. I will provide additional details of the operation of Tsuji with respect to the manner in which the disclosed processor is configured to generate the actuator signal when the object contacts the touch-sensitive input device at a WEST\ EXHIBIT PAGE 261

262 location corresponding to the graphical object below. As I discussed previously with respect to claim element 12f, which I incorporate here by reference, Tsuji discloses determining both a positional and a force-level interaction between the object contacting the touch panel 11 (or 10T) and a graphical object (signal SR indicating which icon R1 through R7 was contacted). With respect to determining a positional interaction, Tsuji discloses that region determining unit 52 determines an interaction between the object contacting the touch panel 11 (or 10T) and a graphical object (signal SR indicating which icon R1 through R7 was contacted). The first step of this process is determining the location on the surface where an object such as a finger contacts the touch-sensitive surface. Tsuji discloses that the processor is configured to determine this location: Pressing an operating surface 11 of the operating panel 10 with a finger generates voltage at both ends of the piezoelectric elements E1 through E4, and an operating force and an operating position are sensed by detecting and calculating said voltage. Ex at Abstract. With respect to the positional determination in the first embodiment, Tsuji describes the principles for sensing the operating position at paragraphs [0049]-[0080]: WEST\ EXHIBIT PAGE 262

Before describing the rest of the configuration of this device, the principles by which the piezoelectric elements E1 through E4 are used to sense which of the operating regions R1 through R7 have been pressed will be described. Ex. at [0049].

In summary, the procedures described at paragraphs [0049]-[0080] utilize the individual pressing-force signals fk from each of the piezoelectric elements E1-E4 to calculate the x and y coordinates of the point of contact P on the surface:

Moreover, a case is imagined where the operating panel 10M is pressed downward at a position at point P (x and y) with a pressing force F. At this time, the principle for sensing (x and y), which are the XY coordinates of point P, using the function of the piezoelectric elements E1 through En, is as described below. Ex. at [0051].

Distance constants a and b (see FIG. 6) saved in advance in a constant storage unit 51c are also applied to the position computing unit 51b, and the position computing unit 51b calculates the position coordinates (x and y) for an operating point using Number 16 and Number 17 described above. Ex. at [0084].

As described above, the calculations presented at [0049]-[0080] of Tsuji provide the XY coordinates of the point of contact P at which an object such as a finger contacts the display surface in the first embodiment. The operating point referenced in the passage above comprises the point of contact in XY coordinates on the operating surface 10, and is represented by a signal SP output by the operating position specification unit 51, shown in Figure 7:

[Figure 7 reproduced here.] Ex. at Figure 7, annotation added.

The operating position signal SP representing the x and y coordinates of the point of contact of an object of the first embodiment, as shown in Figure 7 above, is described by Tsuji as follows:

As a result, an operating position signal SP showing the operating position P (x and y) and an operating force signal SF showing the operating force F are both output from the operation unit 51. The operating position signal SP has two components (x and y). Ex. at [0086] (emphasis added).

With respect to the second embodiment, I previously described that Tsuji discloses use of a resistive, capacitive, optical or acoustic touch panel to determine the point of contact at which an object such as a finger contacts the

display surface in my discussion of claim element 12.a, which I incorporate here by reference.

This embodiment is depicted in Figure 15, which is reproduced below with annotation added in red font to identify relevant numeric components:8

[Figure 15 reproduced here, with annotations: Window / Touch Panel 10T and Surface 11; LCD Display / Info. Display Panel 20; Surface 21; Piezoelectric Elements E1-E4.] Ex. at Figure 15, annotations added.

Tsuji discloses that the touch panel 10T, comprising, for example, a resistive touch panel, will provide the XY coordinates of the point of contact between an object and the surface of the touch-sensitive device. This is described, for example, in the following passages:

In FIG. 15, the display operating unit DP allows an operator to specify an operating position using a touch panel 10T. The touch panel 10T is, for example, a resistive type panel having transparent electrodes arranged in an orthogonal matrix of M rows and N columns in an XY plane on a transparent substrate. With this panel, the intersections of these rows and columns form switches, each cell of the matrix is considered to be a unit, and XY direction operating position signals are output. Ex. at [0134] (emphasis added).

On the other hand, while the terminal voltages ek (k = 1 to 4) of the piezoelectric elements E1 through E4 are each applied in parallel to an operation unit 51F, the operation unit 51F is equivalent to a unit where the position calculating unit 51b has been omitted from the configuration in FIG. 8. That is, because the touch panel 10T specifies the operating position in the second embodiment, all that has to be calculated from the output voltage of the piezoelectric elements E1 through E4 is the total operating force F. Ex. at [0141] (emphasis added).

The operating position referenced in the passages above comprises the point of contact in XY coordinates on the touch panel 10T, and is represented by an operating position signal SP output by the operating position specification unit 51T, shown in Figure 16:

8 The identification of these numbered elements can be found in Tsuji at [0133] and [0137], the latter paragraph further indicating that the remaining configuration of the display operating unit DP in FIG. 15 is the same as that in FIG. 3...

[Figure 16 reproduced here.] Ex. at Figure 16, annotation added.

Thus, as shown above, the embodiments of the touch-sensitive input device disclosed in Tsuji each determine the location on the surface where an object such as a finger contacts the touch-sensitive surface.

The next step in determining a positional interaction between the object contacting the touch-sensitive surface and the graphical object being displayed is determining whether the point of contact represented by operating position signal SP is inside of a region corresponding to a displayed graphical object. This particular state of interaction is what is specifically required by claim 13. As I will describe next, Tsuji discloses that this function is performed by the control circuit processor function comprising region determining unit 52, which he
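For illustration only, the first-embodiment position computation summarized above can be sketched in code. Tsuji's Number 16 and Number 17 formulas are not reproduced in this excerpt, so the sketch below assumes a standard moment-balance idealization for a rectangular panel supported at its four corners by elements E1-E4; the assumed sensor layout, function name, and variable names are hypothetical, not taken from the reference.

```python
def operating_point(f, a, b):
    """Estimate the operating position P(x, y) and total force F from the
    four corner forces f = (f1, f2, f3, f4).

    Assumed (hypothetical) sensor layout: E1 at (0, 0), E2 at (a, 0),
    E3 at (a, b), E4 at (0, b), with a and b the distance constants that
    Tsuji describes as stored in constant storage unit 51c.
    """
    f1, f2, f3, f4 = f
    F = f1 + f2 + f3 + f4   # total operating force (cf. force signal SF)
    # Moment balance about the two panel edges gives a force-weighted
    # position: the farther P sits from an edge, the more force the
    # sensors on the opposite edge carry.
    x = a * (f2 + f3) / F
    y = b * (f3 + f4) / F
    return (x, y), F        # operating position (cf. signal SP) and force
```

Under this idealization, equal forces at all four corners place P at the panel center, and all force on a single element places P at that element's corner, which is the qualitative behavior the quoted passages describe.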

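The region-determination and actuator-signal steps attributed above to Tsuji's MPU can likewise be illustrated with a minimal sketch. The region bounds, effect levels, and force-scaling rule below are hypothetical placeholders, not values from the reference; only the control flow (position SP to region signal SR to actuator signal V, with no output when SP is non-active and a null effect for R0) tracks the disclosure discussed above.

```python
# Hypothetical region table: name -> (x_min, y_min, x_max, y_max).
REGIONS = {
    "R1": (0.0, 0.0, 10.0, 10.0),
    "R2": (12.0, 0.0, 22.0, 10.0),
    "R3": (24.0, 0.0, 34.0, 10.0),
}
# Hypothetical per-region base effect level; R0 (contact outside every
# graphical object) is associated with a null effect.
BASE_EFFECT = {"R0": 0.0, "R1": 0.5, "R2": 1.0, "R3": 1.5}

def determine_region(sp):
    """Sketch of region determining unit 52: return SR for position SP.

    Returns the matching region name, "R0" when the panel is touched
    outside every graphical object, or None when sp is None (no touch,
    i.e. SP at a non-active level).
    """
    if sp is None:
        return None
    x, y = sp
    for name, (x0, y0, x1, y1) in REGIONS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return "R0"

def actuator_signal(sp, force):
    """Return a drive level V for position SP and operating force F."""
    region = determine_region(sp)
    if region is None:
        return 0.0                        # no contact: no actuator output
    return BASE_EFFECT[region] * force    # effect varies with force level
```

The sketch is intentionally simple: a point-in-rectangle comparison against stored region areas, followed by a table lookup scaled by the applied force.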

More information

Capacitive Face Cushion for Smartphone-Based Virtual Reality Headsets

Capacitive Face Cushion for Smartphone-Based Virtual Reality Headsets Technical Disclosure Commons Defensive Publications Series November 22, 2017 Face Cushion for Smartphone-Based Virtual Reality Headsets Samantha Raja Alejandra Molina Samuel Matson Follow this and additional

More information

Appeal decision. Appeal No France. Tokyo, Japan. Tokyo, Japan

Appeal decision. Appeal No France. Tokyo, Japan. Tokyo, Japan Appeal decision Appeal No. 2015-1247 France Appellant Tokyo, Japan Patent Attorney Tokyo, Japan Patent Attorney ALCATEL-LUCENT LTD. OKABE, Yuzuru YOSHIZAWA, Hiroshi The case of appeal against an examiner's

More information

5/30/2018. Prof. Steven S. Saliterman Department of Biomedical Engineering, University of Minnesota

5/30/2018. Prof. Steven S. Saliterman Department of Biomedical Engineering, University of Minnesota Department of Biomedical Engineering, University of Minnesota http://saliterman.umn.edu/ Protect technology/brand/investment. Obtain financing. Provide an asset to increase the value of a company. Establish

More information

Research Collection. Comment on Henkel, J. and F. Jell "Alternative motives to file for patents: profiting from pendency and publication.

Research Collection. Comment on Henkel, J. and F. Jell Alternative motives to file for patents: profiting from pendency and publication. Research Collection Report Comment on Henkel, J. and F. Jell "Alternative motives to file for patents: profiting from pendency and publication Author(s): Mayr, Stefan Publication Date: 2009 Permanent Link:

More information

Speech, Hearing and Language: work in progress. Volume 12

Speech, Hearing and Language: work in progress. Volume 12 Speech, Hearing and Language: work in progress Volume 12 2 Construction of a rotary vibrator and its application in human tactile communication Abbas HAYDARI and Stuart ROSEN Department of Phonetics and

More information

Public Hearings Concerning the Evolving Intellectual Property Marketplace

Public Hearings Concerning the Evolving Intellectual Property Marketplace [Billing Code: 6750-01-S] FEDERAL TRADE COMMISSION Public Hearings Concerning the Evolving Intellectual Property Marketplace AGENCY: Federal Trade Commission. ACTION: Notice of Public Hearings SUMMARY:

More information

Appeal decision. Appeal No Tokyo, Japan Appellant MITSUBISHI ELECTRIC CORPORATION. Tokyo, Japan. Tokyo, Japan. Tokyo, Japan.

Appeal decision. Appeal No Tokyo, Japan Appellant MITSUBISHI ELECTRIC CORPORATION. Tokyo, Japan. Tokyo, Japan. Tokyo, Japan. Appeal decision Appeal No. 2012-23592 Tokyo, Japan Appellant MITSUBISHI ELECTRIC CORPORATION Tokyo, Japan Patent Attorney SOGA, Michiharu Tokyo, Japan Patent Attorney SUZUKI, Norikazu Tokyo, Japan Patent

More information

TEPZZ 879Z A_T EP A1 (19) (11) EP A1 (12) EUROPEAN PATENT APPLICATION. (51) Int Cl.: G06F 3/0354 ( )

TEPZZ 879Z A_T EP A1 (19) (11) EP A1 (12) EUROPEAN PATENT APPLICATION. (51) Int Cl.: G06F 3/0354 ( ) (19) TEPZZ 879Z A_T (11) EP 2 879 023 A1 (12) EUROPEAN PATENT APPLICATION (43) Date of publication: 03.06.1 Bulletin 1/23 (1) Int Cl.: G06F 3/034 (13.01) (21) Application number: 1419462. (22) Date of

More information

Patentability of Computer-Implemented Inventions in the field of Computer Security

Patentability of Computer-Implemented Inventions in the field of Computer Security Patentability of Computer-Implemented Inventions in the field of Computer Security Erik Veillas Patent Examiner, Cluster Computers European Patent Office TU München Munich, 21 June 2011 Acknowledgments

More information

UCF Patents, Trademarks and Trade Secrets. (1) General. (a) This regulation is applicable to all University Personnel (as defined in section

UCF Patents, Trademarks and Trade Secrets. (1) General. (a) This regulation is applicable to all University Personnel (as defined in section UCF-2.029 Patents, Trademarks and Trade Secrets. (1) General. (a) This regulation is applicable to all University Personnel (as defined in section (2)(a) ). Nothing herein shall be deemed to limit or restrict

More information

Trial decision. Conclusion The demand for trial of the case was groundless. The costs in connection with the trial shall be borne by the demandant.

Trial decision. Conclusion The demand for trial of the case was groundless. The costs in connection with the trial shall be borne by the demandant. Trial decision Invalidation No. 2014-800151 Aichi, Japan Demandant ELMO CO., LTD Aichi, Japan Patent Attorney MIYAKE, Hajime Gifu, Japan Patent Attorney ARIGA, Masaya Tokyo, Japan Demandee SEIKO EPSON

More information

i.e. v. e.g. Rule 1 during arguments: If you re losing, start correcting their grammar. - Author Unknown

i.e. v. e.g. Rule 1 during arguments: If you re losing, start correcting their grammar. - Author Unknown BIOTECH BUZZ Biotech Patent Education Subcommittee April 2015 Contributor: Jennifer A. Fleischer i.e. v. e.g. Rule 1 during arguments: If you re losing, start correcting their grammar. - Author Unknown

More information

What s in the Spec.?

What s in the Spec.? What s in the Spec.? Global Perspective Dr. Shoichi Okuyama Okuyama & Sasajima Tokyo Japan February 13, 2017 Kuala Lumpur Today Drafting a global patent application Standard format Drafting in anticipation

More information

United States Patent (19) [11] Patent Number: 5,746,354

United States Patent (19) [11] Patent Number: 5,746,354 US005746354A United States Patent (19) [11] Patent Number: 5,746,354 Perkins 45) Date of Patent: May 5, 1998 54 MULTI-COMPARTMENTAEROSOLSPRAY FOREIGN PATENT DOCUMENTS CONTANER 3142205 5/1983 Germany...

More information

Freedom to Operate (FTO) from a large company s perspective

Freedom to Operate (FTO) from a large company s perspective Freedom to Operate (FTO) from a large company s perspective Dr Stoyan A. Radkov - European Patent Attorney Novartis Pharma AG, Basel, Switzerland 11 October 2010 RSC, Piccadilly, London Overview What do

More information

Localized HD Haptics for Touch User Interfaces

Localized HD Haptics for Touch User Interfaces Localized HD Haptics for Touch User Interfaces Turo Keski-Jaskari, Pauli Laitinen, Aito BV Haptic, or tactile, feedback has rapidly become familiar to the vast majority of consumers, mainly through their

More information

Committee on Development and Intellectual Property (CDIP)

Committee on Development and Intellectual Property (CDIP) E CDIP/16/4 ORIGINAL: ENGLISH DATE: AUGUST 26, 2015 Committee on Development and Intellectual Property (CDIP) Sixteenth Session Geneva, November 9 to 13, 2015 PROJECT ON THE USE OF INFORMATION IN THE PUBLIC

More information

TEPZZ 7 Z_ 4A T EP A2 (19) (11) EP A2 (12) EUROPEAN PATENT APPLICATION. (51) Int Cl.: G06F 3/0488 ( ) G06F 3/0482 (2013.

TEPZZ 7 Z_ 4A T EP A2 (19) (11) EP A2 (12) EUROPEAN PATENT APPLICATION. (51) Int Cl.: G06F 3/0488 ( ) G06F 3/0482 (2013. (19) TEPZZ 7 Z_ 4A T (11) EP 2 720 134 A2 (12) EUROPEAN PATENT APPLICATION (43) Date of publication: 16.04.2014 Bulletin 2014/16 (51) Int Cl.: G06F 3/0488 (2013.01) G06F 3/0482 (2013.01) (21) Application

More information

Clarke B. Nelson, CPA, ABV, CFF, CGMA, MBA Senior Managing Director & Founder InFact Experts LLC

Clarke B. Nelson, CPA, ABV, CFF, CGMA, MBA Senior Managing Director & Founder InFact Experts LLC Curriculum Vitae Clarke B. Nelson, CPA, ABV, CFF, CGMA, MBA Senior Managing Director & Founder InFact Experts LLC cnelson@infact-experts.com Salt Lake City Office 175 South Main Street, Suite 630 Salt

More information

IN THE UNITED STATES PATENT AND TRADEMARK OFFICE BEFORE THE PATENT TRIAL AND APPEAL BOARD AMAZON.COM, INC. & LENOVO (UNITED STATES) INC., - vs.

IN THE UNITED STATES PATENT AND TRADEMARK OFFICE BEFORE THE PATENT TRIAL AND APPEAL BOARD AMAZON.COM, INC. & LENOVO (UNITED STATES) INC., - vs. Paper No. 1 IN THE UNITED STATES PATENT AND TRADEMARK OFFICE BEFORE THE PATENT TRIAL AND APPEAL BOARD AMAZON.COM, INC. & LENOVO (UNITED STATES) INC., - vs. - Petitioners PRAGMATUS MOBILE LLC, Patent Owner

More information

Paper Entered: November 25, 2015 UNITED STATES PATENT AND TRADEMARK OFFICE BEFORE THE PATENT TRIAL AND APPEAL BOARD

Paper Entered: November 25, 2015 UNITED STATES PATENT AND TRADEMARK OFFICE BEFORE THE PATENT TRIAL AND APPEAL BOARD Trials@uspto.gov Paper 8 571-272-7822 Entered: November 25, 2015 UNITED STATES PATENT AND TRADEMARK OFFICE BEFORE THE PATENT TRIAL AND APPEAL BOARD WANGS ALLIANCE CORPORATION d/b/a WAC LIGHTING CO., Petitioner,

More information

The below identified patent application is available for licensing. Requests for information should be addressed to:

The below identified patent application is available for licensing. Requests for information should be addressed to: DEPARTMENT OF THE NAVY OFFICE OF COUNSEL NAVAL UNDERSEA WARFARE CENTER DIVISION 1176 HOWELL STREET NEWPORT Rl 02841-1708 IN REPLY REFER TO Attorney Docket No. 300119 25 May 2017 The below identified patent

More information

ESTABLISHING A LEGAL MONOPOLY THROUGH PATENT LAW By Gold & Rizvi, P.A. The Idea Attorneys

ESTABLISHING A LEGAL MONOPOLY THROUGH PATENT LAW By Gold & Rizvi, P.A. The Idea Attorneys ESTABLISHING A LEGAL MONOPOLY THROUGH PATENT LAW By Gold & Rizvi, P.A. The Idea Attorneys PATENT BASICS In its simplest form, a patent is a legal monopoly granted by the United States Government to an

More information

Paper 24 Tel: Entered: February 8, 2017 UNITED STATES PATENT AND TRADEMARK OFFICE

Paper 24 Tel: Entered: February 8, 2017 UNITED STATES PATENT AND TRADEMARK OFFICE Trials@uspto.gov Paper 24 Tel: 571-272-7822 Entered: February 8, 2017 UNITED STATES PATENT AND TRADEMARK OFFICE BEFORE THE PATENT TRIAL AND APPEAL BOARD ERICSSON INC. AND TELEFONAKTIEBOLAGET LM ERICSSON,

More information

11th Annual Patent Law Institute

11th Annual Patent Law Institute INTELLECTUAL PROPERTY Course Handbook Series Number G-1316 11th Annual Patent Law Institute Co-Chairs Scott M. Alter Douglas R. Nemec John M. White To order this book, call (800) 260-4PLI or fax us at

More information

73 Assignee: Dialight Corporation, Manasquan, N.J. 21 Appl. No.: 09/144, Filed: Aug. 31, 1998 (51) Int. Cl... G05F /158; 315/307

73 Assignee: Dialight Corporation, Manasquan, N.J. 21 Appl. No.: 09/144, Filed: Aug. 31, 1998 (51) Int. Cl... G05F /158; 315/307 United States Patent (19) Grossman et al. 54) LED DRIVING CIRCUITRY WITH VARIABLE LOAD TO CONTROL OUTPUT LIGHT INTENSITY OF AN LED 75 Inventors: Hyman Grossman, Lambertville; John Adinolfi, Milltown, both

More information