Transcription:

US 20140247283 A1

(19) United States
(12) Patent Application Publication    (10) Pub. No.: US 2014/0247283 A1
     Jo                                 (43) Pub. Date: Sep. 4, 2014

(54) UNIFYING AUGMENTED REALITY AND BIG DATA

(71) Applicants: Geun Sik Jo, Incheon (KR); Inha Industry Partnership Institute, Incheon (KR)
(72) Inventor: Geun Sik Jo, Incheon (KR)
(73) Assignees: Geun Sik Jo, Incheon (KR); Inha Industry Partnership Institute, Incheon (KR)
(21) Appl. No.: 14/273,670
(22) Filed: May 9, 2014

(30) Foreign Application Priority Data
     Jan. 5, 2013 (KR) ..... 10-2013-0006014
     Jul. 5, 2013 (KR) ..... PCT/KR2013/006014

Publication Classification
(51) Int. Cl.: G06T 11/60 (2006.01); G06T 11/00 (2006.01)
(52) U.S. Cl.: CPC G06T 11/60 (2013.01); G06T 11/001 (2013.01); USPC 345/633

(57) ABSTRACT
Embodiments of the present invention relate to unifying augmented reality technology and big data. An interactive operation element may be defined. The interactive operation element is associated with an event and a location on an augmented reality (AR) screen. An action may be performed based on the event using a predefined communication protocol. The action may be associated with an information artifact which is derived from big data.

(Front-page figure: the FIG. 1 flowchart, transcribed under Sheet 1 below.)

Patent Application Publication    Sep. 4, 2014    Sheet 1 of 17    US 2014/0247283 A1

FIG. 1 (flowchart):
  102  IDENTIFY REAL-WORLD OBJECTS SUBJECT TO RECOGNITION
  104  COLLECT AND STORE IMAGES
  106  DERIVE AND ASSIGN INFORMATION ARTIFACTS
  108  ASSOCIATE IMAGES TO HOMOGRAPHY MATRIX
  110  PERFORM SEARCH FOR IMAGES OF REAL-WORLD OBJECT
  112  DETERMINE PRECISE LOCATION FOR AUGMENTATION DISPLAY
  114  DISPLAY AND INTERACT WITH INFORMATION ARTIFACTS

FIG. 2 (block diagram): system 150 comprising an AUGMENTED REALITY (AR) ENGINE 152 and a STORAGE SYSTEM.
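The patent gives no implementation for steps 108-112, but the underlying idea — match stored reference images of an object against the live camera frame and use the resulting homography to fix the overlay position — can be sketched with standard computer-vision tooling. Below is a minimal sketch using OpenCV; the function name, the choice of ORB features, and the thresholds are assumptions of this sketch, not taken from the specification.

```python
import cv2
import numpy as np

def locate_object(reference_img, camera_frame, min_matches=10):
    """Estimate where a stored reference image appears in a camera frame.

    Returns the projected corners of the reference image in frame
    coordinates (the 'precise location for augmentation display' of
    step 112), or None if the object is not found.
    """
    def gray(img):
        return cv2.cvtColor(img, cv2.COLOR_BGR2GRAY) if img.ndim == 3 else img

    orb = cv2.ORB_create()
    kp_ref, des_ref = orb.detectAndCompute(gray(reference_img), None)
    kp_frm, des_frm = orb.detectAndCompute(gray(camera_frame), None)
    if des_ref is None or des_frm is None:
        return None

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_ref, des_frm)
    if len(matches) < min_matches:
        return None

    # Step 108: relate the stored image to the frame via a homography matrix.
    src = np.float32([kp_ref[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_frm[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    if H is None:
        return None

    # Step 112: project the reference image corners into the frame.
    h, w = gray(reference_img).shape[:2]
    corners = np.float32([[0, 0], [w, 0], [w, h], [0, h]]).reshape(-1, 1, 2)
    return cv2.perspectiveTransform(corners, H)
```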

Patent Application Publication    Sep. 4, 2014    Sheet 2 of 17    US 2014/0247283 A1

FIG. 3 (flowchart): START → INTERACTIVE OPERATION ELEMENT DEFINITION 202 (202A, 202B) → INTERACTIVE OPERATION ELEMENT REGISTRATION 204 (204A-204E) → COMMUNICATION PROTOCOL DEFINITION 206 (206A-206H) → END

Patent Application Publication    Sep. 4, 2014    Sheet 3 of 17    US 2014/0247283 A1

FIG. 4 (tree structure representation for interactive operation elements): drawing text not legible in this transcription.

Patent Application Publication    Sep. 4, 2014    Sheet 4 of 17    US 2014/0247283 A1

FIG. 5 (table of example interactive operation element properties and property attributes): drawing text not legible in this transcription.

Patent Application Publication    Sep. 4, 2014    Sheet 5 of 17    US 2014/0247283 A1

FIG. 6 (XML code connecting an information artifact to an interactive operation element): drawing text not legible in this transcription.

Patent Application Publication    Sep. 4, 2014    Sheet 6 of 17    US 2014/0247283 A1

FIG. 7 (structure definition of the communication protocol, 700):

MESSAGE_ID (MID):
  100: Hello Message                    110: ACK Hello Message
  120: View Change Message              130: ACK View Change Message
  140: Mouse Click Event Message        150: ACK Mouse Click Event Message
  160: Mouse Over Event Message         170: ACK Mouse Over Event Message
  180: Instruction Information Message  190: ACK Instruction Information Message
  200: AR Control Message               201: ACK AR Control Message

SOURCE (S): 1: KBS; 2: AR
DESTINATION (D): 1: KBS; 2: AR
SLIDE_ID: String (e.g. 1, 2, 3 ...)
BUTTON_NAME: String (e.g. rv2_hsp1_bt1)
MOUSE_OVER_TYPE: 1: in; 2: out
INSTRUCTION_NUMBER: String (e.g. INS-01, INS-03, INS-10)
EVENT_TYPE: 1: Previous; 2: Next
ACTION_TYPE: 1: SHOW; 2: HIDDEN; 3: CLOSE
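Read as a wire protocol, the FIG. 7 table pairs every request with an ACK and attaches typed fields such as the slide ID and button name. The specification does not define an encoding; the sketch below assumes a simple JSON encoding, with field names adapted from the FIG. 7 labels.

```python
import json
from enum import IntEnum

class MessageId(IntEnum):
    """Message IDs transcribed from FIG. 7."""
    HELLO = 100
    ACK_HELLO = 110
    VIEW_CHANGE = 120
    ACK_VIEW_CHANGE = 130
    MOUSE_CLICK_EVENT = 140
    ACK_MOUSE_CLICK_EVENT = 150
    MOUSE_OVER_EVENT = 160
    ACK_MOUSE_OVER_EVENT = 170
    INSTRUCTION_INFORMATION = 180
    ACK_INSTRUCTION_INFORMATION = 190
    AR_CONTROL = 200
    ACK_AR_CONTROL = 201

class Endpoint(IntEnum):
    """SOURCE (S) / DESTINATION (D) codes from FIG. 7."""
    KBS = 1
    AR = 2

def encode(mid: MessageId, src: Endpoint, dst: Endpoint, **fields) -> bytes:
    """Serialize one message; the JSON envelope is an assumption of this sketch."""
    return json.dumps({"MID": int(mid), "S": int(src), "D": int(dst), **fields}).encode()

# Example: the knowledge-based system reports a mouse click to the AR engine.
msg = encode(MessageId.MOUSE_CLICK_EVENT, Endpoint.KBS, Endpoint.AR,
             SLIDE_ID="1", BUTTON_NAME="rv2_hsp1_bt1")
```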

Patent Application Publication    Sep. 4, 2014    Sheets 8-16 of 17    US 2014/0247283 A1

FIG. 8B (screen display of information artifacts, 850): labeled WHEEL AND BRAKE.
FIG. 9A (communication method): drawing text not legible in this transcription.
FIG. 9B (screen switch to a submenu display, 950): drawing only.
FIG. 10A (communication method): drawing text not legible in this transcription.
FIG. 10B (display of an information artifact): labeled (UPPER) PIN and MAIN FITTING.
FIG. 11A (event message): drawing text not legible in this transcription.
FIG. 11B (removal of an information artifact from a screen, 1150): drawing only.
FIG. 12A (event message): drawing text not legible in this transcription.
FIG. 12B (popup window display): drawing text not legible in this transcription.

Patent Application Publication    Sep. 4, 2014    Sheet 17 of 17    US 2014/0247283 A1

FIG. 13 (example interaction system, 1300): ELEMENT DEFINITION UNIT 1302; ELEMENT REGISTRATION UNIT 1304; PROTOCOL DEFINITION UNIT 1306.

UNIFYING AUGMENTED REALITY AND BIG DATA

CROSS-REFERENCE TO RELATED APPLICATION

[0001] This application is the National Stage Entry of International Application No. PCT/KR2013/006014 filed on Jul. 5, 2013, which claims priority from and the benefit of Korean Patent Application No. 10-2013-0006014 filed on Aug. 28, 2012, both of which are herein incorporated by reference for all purposes as if fully set forth herein.

TECHNICAL FIELD

[0002] Embodiments of the present invention relate generally to unifying augmented reality technology and big data. Specifically, the present invention relates to the interaction between a user and virtual objects within an augmented reality environment.

BACKGROUND

[0003] Augmented reality (AR) is a live copy view of a physical, real-world environment whose elements are augmented (or supplemented) by computer-generated sensory input such as sound, video, graphics, or GPS data. It is related to a more general concept called mediated reality, in which a view of reality is modified by a computer. As a result, the technology functions by enhancing one's current perception of reality. By contrast, virtual reality replaces the real world with a simulated one. Augmentation is conventionally in real time and in semantic context with environmental elements, such as sports scores on television (TV) during a sporting event. With the help of advanced AR technology (e.g., adding computer vision and object recognition), the information about the surrounding real world of the user becomes interactive and can be digitally manipulated. Artificial information about the environment and its objects can be overlaid on the real world.

[0004] Big data is the term for a collection of data sets so large and complex that it becomes difficult to process using on-hand database management tools or traditional data processing applications. The challenges include capture, curation, storage, search, sharing, transfer, analysis, and visualization. The trend to larger data sets is due to the additional information derivable from analysis of a single large set of related data, as compared to separate smaller sets with the same total amount of data, allowing correlations to be found to spot business trends, determine quality of research, prevent diseases, link legal citations, combat crime, determine real-time roadway traffic conditions, and the like.

SUMMARY

[0005] Embodiments of the present invention relate to unifying augmented reality technology and big data. An interactive operation element may be defined. The interactive operation element is associated with an event and a location on an augmented reality (AR) screen. An action may be performed based on the event using a predefined communication protocol. The action may be associated with an information artifact which is derived from big data.

[0006] A first aspect of the present invention provides an interactive method for unifying augmented reality (AR) and streaming video with big data, the method comprising the computer-implemented steps of: defining an interactive operation element, wherein the interactive operation element is associated with an event; associating the interactive operation element with a location on an augmented reality screen; and performing an action based on the event using a communication protocol, wherein the action is associated with an information artifact derived from big data, wherein the big data is collected through the Semantic Web.
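As a rough illustration of the three computer-implemented steps of the first aspect — define an element tied to an event, bind it to a screen location, and perform the associated action when the event fires — consider the following minimal sketch. All class, method, and field names here are illustrative assumptions, not taken from the specification.

```python
from dataclasses import dataclass
from typing import Callable, Dict, Tuple

@dataclass
class InteractiveOperationElement:
    """An element on the AR screen that reacts to one event type."""
    element_id: str
    event_type: str                     # e.g. "mouse_click", "mouse_over"
    location: Tuple[int, int] = (0, 0)  # position on the AR screen
    # The action would display an information artifact derived from big data.
    action: Callable[[], None] = lambda: None

class ARScreen:
    def __init__(self) -> None:
        self._elements: Dict[str, InteractiveOperationElement] = {}

    def register(self, element: InteractiveOperationElement) -> None:
        """Step 2: associate the element with a location on the AR screen."""
        self._elements[element.element_id] = element

    def dispatch(self, element_id: str, event_type: str) -> None:
        """Step 3: perform the element's action when its event occurs."""
        el = self._elements.get(element_id)
        if el is not None and el.event_type == event_type:
            el.action()
```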
[0007] A second aspect of the present invention provides a computer program product for unifying AR and streaming video with big data, the computer program product comprising a computer readable storage media, and program instructions stored on the computer readable storage media, to: define an interactive operation element, wherein the interactive operation element is associated with an event; associate the interactive operation element with a location on an augmented reality screen; and perform an action based on the event using a communication protocol, wherein the action is associated with an information artifact derived from big data, wherein the big data is collected through the Semantic Web.

BRIEF DESCRIPTION OF THE DRAWINGS

[0008] These and other features of this invention will be more readily understood from the following detailed description of the various aspects of the invention taken in conjunction with the accompanying drawings in which:

[0009] FIG. 1 depicts an interaction method for unifying conventional augmented reality (AR) technology and information artifacts;

[0010] FIG. 2 depicts a conventional system for unifying conventional augmented reality (AR) technology and information artifacts;

[0011] FIG. 3 depicts a system for unifying conventional augmented reality (AR) technology and information artifacts according to an embodiment of the present invention;

[0012] FIG. 4 depicts a tree structure representation for interactive operation elements according to an embodiment of the present invention;

[0013] FIG. 5 depicts a table showing example interactive operation element properties and property attributes according to an embodiment of the present invention;

[0014] FIG. 6 depicts XML code illustrating an explicit connection between an information artifact and an interactive operation element according to an embodiment of the present invention;

[0015] FIG. 7 depicts an exemplary structure definition of a communication protocol according to an embodiment of the present invention;

[0016] FIG. 8A depicts an initial communication method according to an embodiment of the present invention;

[0017] FIG. 8B depicts a screen display of information artifacts according to an embodiment of the present invention;

[0018] FIG. 9A depicts a communication method according to an embodiment of the present invention;

[0019] FIG. 9B depicts a screen switch to a submenu display according to an embodiment of the present invention;

[0020] FIG. 10A depicts a communication method according to an embodiment of the present invention;

[0021] FIG. 10B depicts a display of an information artifact from a screen according to an embodiment of the present invention;

[0022] FIG. 11A depicts an event message according to an embodiment of the present invention;

[0023] FIG. 11B depicts a removal of an information artifact from a screen according to an embodiment of the present invention;

[0024] FIG. 12A depicts an event message according to an embodiment of the present invention;

[0025] FIG. 12B depicts a popup window display according to an embodiment of the present invention; and

[0026] FIG. 13 depicts an example interaction system according to an embodiment of the present invention.

[0027] The drawings are not necessarily to scale. The drawings are merely schematic representations, not intended to portray specific parameters of the invention. The drawings are intended to depict only typical embodiments of the invention, and therefore should not be considered as limiting the scope of the invention. In the drawings, like numbering represents like elements.

DETAILED DESCRIPTION

[0028] While the system and method of the present application is susceptible to various modifications and alternative forms, specific embodiments thereof have been shown by way of example in the drawings and are herein described in detail. It should be understood, however, that the description herein of specific embodiments is not intended to limit the invention to the particular embodiment disclosed, but on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the process of the present application as defined by the appended claims.

[0029] The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of this disclosure. As used herein, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. Furthermore, the use of the terms "a", "an", etc., do not denote a limitation of quantity, but rather denote the presence of at least one of the referenced items. It will be further understood that the terms "comprises" and/or "comprising", or "includes" and/or "including", when used in this specification, specify the presence of stated features, regions, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, regions, integers, steps, operations, elements, components, and/or groups thereof.

[0030] As indicated above, embodiments of the present invention relate to unifying augmented reality technology and big data. An interactive operation element may be defined. The interactive operation element is associated with an event and a location on an augmented reality (AR) screen. An action may be performed based on the event using a predefined communication protocol. The action may be associated with an information artifact which is derived from big data.

[0031] Augmented reality (AR) is divided into marker based technology and non-marker based technology. In marker based augmented reality, an image including a particular image of a marker, such as a black and white pattern or a barcode, is recognized, a relative coordinate of an area in which the virtual object is to be displayed is determined, and the virtual object is displayed based thereon; whereas, in non-marker based augmented reality, an object within the image is directly identified and related information is obtained.
Since the marker based approach can be included in the scope of non-marker based approach research, application of the present invention can be illustrated using point of interest based technology, which is one of the sub-methods of the non-marker based approach.

[0032] Augmented reality (AR) in the last decade has increased in popularity in various areas, such as education, advertising, maintenance, marketing, and entertainment. In the areas of maintenance and education specifically, the use of augmented reality can provide for the transfer of knowledge at a faster rate than other traditional methods. Additionally, the use of AR can help companies train their employees faster and better. The use of AR can also assist company employees in performing job tasks more competently and efficiently.

[0033] An area that can benefit from the use of AR is the maintenance of complex systems, such as aircraft maintenance. To that end, three-dimensional (3D) or two-dimensional (2D) graphics or images, text, or other media may be generated such that they are overlaid on and registered with surrounding objects in the environment. Applying AR to maintenance tasks could make it possible for users to be trained for those tasks, and actively assisted during their performance, without ever needing to refer to separate paper or electronic technical orders. Incorporating instruction and assistance directly within the task domain, and directly referencing the equipment at which the user is looking, may eliminate the need for maintenance employees to continually switch their focus of attention between the task and its separate documentation. Use of AR may decrease the overall time of the repair, reduce errors in the repair process, and increase the knowledge of maintenance personnel.

[0034] FIG. 1 depicts a high-level flow diagram of a process for the conventional realization and operation of unifying augmented reality or streaming video with big data to enable interaction between a user and virtual objects within an augmented reality or video environment. In step 102, real-world objects related to a specific scenario or sequential procedure (e.g., tools and parts used in aircraft maintenance) which are to be subject to recognition in an augmented reality environment are identified. In other words, situations where performance must follow a sequential procedure (e.g., aircraft or submarine maintenance) are determined. In one example, the big data is collected through the Semantic Web. The Semantic Web is the extension of the World Wide Web that enables people to share content beyond the boundaries of applications and websites. Examples provided herein describe the process for unifying AR and big data. However, in other examples, big data may also be unified with streaming video. As used herein, streaming video includes any type of video that can be displayed on a video player or website using a web browser that supports streaming media.

[0035] In step 104, images of the real-world objects are collected. In one example, images from various camera angles showing the front (representative image), left, right, top, and bottom of the real-world object are collected. The images are catalogued and stored in a database for easy retrieval.

[0036] In step 106, information artifacts are derived and assigned to each real-world object subject to recognition. These artifacts may include a label, circle, button, thumbnail, video, image, text, or the like.
The information artifacts are virtual objects that can be overlaid on the actual equipment being maintained, which can significantly improve the productivity of individuals performing maintenance tasks. As used herein, information artifacts are derived from big data.
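Steps 104 and 106 amount to a catalogue that keys each recognizable object to its stored reference views and to the information artifacts assigned to it. Below is a minimal sketch of such a catalogue; the field names and file paths are illustrative assumptions, not taken from the specification.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class InformationArtifact:
    """A virtual object overlaid on the real equipment (step 106)."""
    kind: str     # "label", "circle", "button", "thumbnail", "video", ...
    content: str  # text or a media reference, derived from big data

@dataclass
class RealWorldObject:
    """An object subject to recognition, with its stored views (step 104)."""
    name: str
    views: Dict[str, str] = field(default_factory=dict)  # camera angle -> image path
    artifacts: List[InformationArtifact] = field(default_factory=list)

catalog: Dict[str, RealWorldObject] = {}

wheel = RealWorldObject("wheel_and_brake")
for angle in ("front", "left", "right", "top", "bottom"):  # angles named in step 104
    wheel.views[angle] = f"images/wheel_and_brake/{angle}.png"
wheel.artifacts.append(InformationArtifact("label", "WHEEL AND BRAKE"))
catalog[wheel.name] = wheel
```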