Tales from the frontier


Parallel Session 4
Chair: Attila Havas, Hungarian Academy of Sciences, Hungary
Rapporteur: Clement Bezold, Institute for Alternative Futures, USA

Table of Contents
Paper 1: QTIP: Quick Technology Intelligence Processes
Presentation 1
Paper 2: The Scan Process and Technology Foresight
Presentation 2
Paper 3: Tracing Emerging Irreversibilities in Emerging Technology: The Case of Nanotubes
Presentation 3
Paper 4: Scenario-based Roadmapping: A Conceptual View
Presentation 4
Paper 5: Evaluation of Laboratory Directed Research and Development (LDRD) Investment Areas at Sandia
Presentation 5
Paper 6: Dynamic Monitoring of Future Developments

Paper 1: QTIP: Quick Technology Intelligence Processes

Alan L. Porter [1]

How long does it take to provide a particular Future-oriented Technology Analysis (FTA)? We have traditionally perceived the answer calibrated in months, particularly for empirical technology analyses. This mindset contributes to many technology management or policy decisions relying primarily upon intuitive sources of knowledge. That need no longer be the case. This paper makes the case for quick text mining profiles of emerging technologies. I describe what we call "tech mining" -- deriving technology intelligence especially from R&D information resources. i,ii The phenomenon of interest is speed, but with provision of information that truly facilitates technology management. The time to conduct certain technology analyses can be reduced from months to minutes by taking advantage of four factors enabling QTIP -- Quick Technology Intelligence Processes: 1) instant database access, 2) analytical software, 3) automated routines, and 4) decision process standardization.

The first QTIP factor concerns information availability. A defining characteristic of the "Information Economy" is enhanced access to information. Of particular note to FTA, the great science and technology (S&T) databases cover a significant portion of the world's research output. These databases can be searched from one's computer, enabling retrieval of electronic records in seconds. Many organizations have unlimited-use licenses to particular databases that allow thousands of records to be located and downloaded on a given topic at no additional cost. Various databases compile information on journal and conference papers, patents, R&D projects, and so forth. In addition, many researchers share information via the Internet (e.g., physicists increasingly post their papers at arxiv.org). Other databases cover policy, popular press, and business activities. These can be exploited to help understand contextual factors affecting particular technological innovations. All told, this wealth of information enables potent technological intelligence analyses.

The second QTIP factor consists of expedited analyses using one form of "tech mining" software. This paper employs VantagePoint, but the specifics are less important than the principles. Namely, many aspects of data cleaning, statistical analyses, trend analyses, and information visualization can be done quite briskly.

The third contributing factor, automated routines, makes a huge difference. As a loose analogy, consider the change from the hand-made automobile to the assembly-line Model T Ford. Once we identify a set of analytical steps that we want to do repeatedly, we can write scripts (software programs or macros) that automate those steps. Now the analyst devotes energies to refining results, presenting them effectively, and interpreting them.

[1] Alan Porter is Director of R&D, Search Technology, Inc., Norcross, GA, USA 30071; phone: ; e-mail: aporter@searchtech.com. He is also Professor Emeritus, Georgia Tech, and co-directs the Technology Policy & Assessment Center there [//tpac.gatech.edu].

For instance, suppose we have a certain S-shaped growth model that we find highly informative for a particular family of technology forecasts. We now "push a button" to generate and plot such a model. We then inspect it, decide a different growth limit should be investigated, and "push the button" again. In a minute or so, we can examine several alternatives, select the one(s) for presentation, extrapolate to offer a range of future possibilities, and give our interpretation.

The fourth factor profoundly changes the receptivity to empirical analyses. A major impediment to the utilization of FTA results is their unfamiliarity to managers and policy-makers. Today, major organizations are standardizing certain strategic technology and business decision processes. Stage-gate approaches set forth explicit decisions to be sequenced toward particular ends (e.g., new product development). Furthermore, we see organizations going the next step -- to require specific analyses and outputs at each stage. This facilitates the automated routines (factor three). But, even more importantly, it familiarizes users with data-based technology analyses. The manager who gets the prescribed FTA outputs upon which to base particular technology management decisions comes to know them. (S)he develops understanding of their strengths and limitations, and, thus, how best to use this derived knowledge to make better decisions. In this way, technology intelligence gains credibility as a vital decision aid.

The next section illustrates what it takes to produce composite empirical responses to particular technology management questions, quickly.

I. Case Example: Solid Oxide Fuel Cells

I confess -- this analysis was not done in the target time of "one day." Instead it derives from analytical work that has been ongoing for two years as illustrative material for a book (see Endnote 1). But I would like to use the content to consider the four QTIP factors noted above and to show how this work could, indeed, be done in a day.

Fuel cells are the example technology. They convert hydrogen and oxygen into water, producing electricity and heat in the process. They function as an electrochemical device, like a battery. However, fuel cells are recharged with hydrogen and oxygen instead of electricity. Solid oxide fuel cells (SOFCs) are one of five major types of fuel cell. They typically use hydrocarbons as fuel and operate at high temperatures. SOFCs are well suited for applications such as power plants.

Our QTIP setting presumes an organization with an established FTA framework. QTIP works where we know the sort of information we need. That implies "working back" from the decision support requirements to the data. It makes less sense for a "data mining" mindset in which we muck around in the data looking for things that might be of interest, to someone, sometime. Suppose we have a systematized decision process that calls for answers to particular technology management questions. Imagine a scenario in which our organization is an American company initiating operations in Australia. These involve an innovation that needs a power supply for remote settings. We've already investigated technologies and determined that SOFCs appear most promising, but they will need enhancement and customization.
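To make the "push a button" growth-model step described above concrete, here is a minimal illustrative sketch in Python. The annual counts are invented for demonstration and are not data from this paper; a scripted routine of this kind is what lets an analyst refit and replot in seconds.

import numpy as np
from scipy.optimize import curve_fit

def logistic(t, limit, rate, midpoint):
    # Cumulative S-curve: limit = growth ceiling, rate = steepness, midpoint = inflection offset.
    return limit / (1.0 + np.exp(-rate * (t - midpoint)))

years = np.arange(1994, 2004)
annual_papers = np.array([12, 18, 25, 40, 61, 90, 130, 170, 210, 240])  # hypothetical counts
t = years - years[0]
cumulative = np.cumsum(annual_papers)

# Fit once, inspect, then "push the button again" with a different ceiling guess if desired.
params, _ = curve_fit(logistic, t, cumulative, p0=[2 * cumulative[-1], 0.5, 5.0], maxfev=10000)
limit, rate, midpoint = params
print(f"Estimated saturation ~{limit:.0f} papers, inflection around {years[0] + midpoint:.1f}")

# Extrapolate a few years ahead to offer a range of future possibilities.
projection = logistic(np.arange(t[-1] + 6), *params)

Re-running with a different initial ceiling guess, or with bounds on the limit parameter, is the "examine several alternatives" step described above.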

So, we'd now like to answer two questions:
1. Are there recent advances in SOFCs that we need to understand?
2. Might there be a suitable "Aussie" partner to work with us on this development?

This established framework constitutes the fourth factor needed for QTIP -- decision process standardization. As we have been developing QTIP, [2] we have arrayed:
- 13 Management of Technology ("MOT") issues
- 39 MOT questions
- ~200 "innovation indicators"

Innovation indicators are empirical measures rooted in models of how technological innovation proceeds. In our framework, these indicators reflect one of three main types of MOT information: technological maturation (life cycle status), contextual influences, and market potential. iii The innovation indicators help answer MOT questions. Our list of some 200 indicators is not exhaustive, but it suggests particular indicators for each MOT question. One would adapt these to one's data sources and managerial concerns to posit particular indicators. We emphasize that pre-specifying the empirical indicators for selected MOT questions and issues is vital for QTIP to work. Having standard information dramatically enhances managerial receptivity:
- Standard information becomes familiar information.
- Familiar information becomes credible information.
- Credible information gets put to use in decision-making.
- Information that is used gets requested "next time."
- Information that is requested repeatedly merits automation of its generation.

We next need to align the other three factors to enable QTIP -- database access, analytical tools, and scripting.

Factor 1 is "instant database access." That is, access to the requisite information resources should be direct and seamless. The analyst should not have to work through an intermediary to search and retrieve electronic documents. This counters the current arrangements in many organizations. Often, an information services unit handles all data requests. In the past this typically meant that a researcher or analyst requested topical information from an information professional who sifted through the sources. (S)he eventually provided the requester with a very few documents to read. Such an arrangement fails at QTIP. However, we suggest that information professionals still have vital roles to play. Centralized information services are best positioned to arrange access to prime information resources. They can negotiate fair licenses that enable desktop access to the most useful and affordable databases. They can serve tremendously in showing others how to search using Boolean and other approaches, explaining database nuances, and reviewing critical searches to suggest refinements. We suggest that information professionals consider expanding their skill sets to become expert on analytical tools and trainers on how to use these effectively. Unfortunately, our experience indicates that just providing the analytical tools to information professionals is unlikely to lead to effective FTA applications.

[2] We gratefully acknowledge support of the U.S. National Science Foundation for "QTIPs Hour Technology Intelligence & Forecasting" (DMI ).

On the other hand, expanded information professional roles could enable their becoming full FTA team members. iv One way or another, quick technology analyses must be done with a minimum of intermediaries.

In our SOFC example, we use R&D publication abstract records from the Science Citation Index (SCI) and INSPEC, and patent abstracts from the Derwent World Patent Index (DWPI). [3] Organizations access such databases in various ways. For instance, Georgia Tech previously hosted key databases on its own server for access by students, staff, and faculty. Presently it accesses some databases via a state-wide consortium called Galileo, others directly using passwords through their internet sites, and some using CD-ROMs. At times the Technology Policy and Assessment Center at Georgia Tech has accessed such sources through a gateway service, Dialog. Whatever the route, the key is that, upon being alerted to a need to assess an emerging technology, the analyst can obtain the records immediately from such sources. In practice, this often entails a process like the following:
- perform a simple search using basic terminology -- e.g., look for papers containing the term "solid oxide fuel cell(s)"
- retrieve a sample of those records
- perform elementary analyses to gain a quick sense of what those records include
- get a subject matter expert to scan those results and suggest refinements to improve:
  - recall (to capture as much of the available information on the subject as possible)
  - precision (to minimize inclusion of extraneous information -- noise)
- refine the search (e.g., include synonymous terms such as the acronym "SOFC," and exclude unwanted terms)
- download the resulting records covering the topic of interest.

The resulting information for "tech mining" consists primarily of science and technology (S&T) publication and patent abstract records. Typically, these number from hundreds to tens of thousands. In the latter case, downloading may have to be done piecewise. Nonetheless, the downloading can usually be completed in minutes. The complete search and retrieval process can often be completed in under an hour, contingent on how delicate the search specification and refining processes are. Those depend on the sensitivity of the MOT issues being addressed. In our SOFC example, one might imagine the first question (what's hot?) being less sensitive than the second (is this a suitable partner?).

In this case, we actually began with searches on "fuel cells" in general. These had yielded 11,764 abstracts of journal and conference research papers gathered from the Science Citation Index and INSPEC, and 9,724 patent family records from DWPI. For the SOFC queries, we examined 1,286 of the publication abstracts that included SOFC in their title and 474 patent abstracts that mentioned SOFCs. Note that this is far too many documents to sensibly read and digest! So we turn to software tools to help "profile the R&D domain." v vi

[3] We accessed data via Dialog, a leading gateway to over 400 different databases. We thank IEE, Thomson Derwent, Thomson ISI, and Dialog for access to the fuel cell data.
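The search-and-refine loop just listed can be sketched in a few lines. The following Python fragment is purely illustrative: the "database" is an in-memory list of titles standing in for SCI/INSPEC/DWPI access, and the term lists are assumptions rather than the actual queries used in this study.

def build_query(include_terms, exclude_terms=()):
    # Compose a simple Boolean query string of the kind submitted to S&T databases.
    query = "(" + " OR ".join(f'"{t}"' for t in include_terms) + ")"
    if exclude_terms:
        query += " NOT (" + " OR ".join(f'"{t}"' for t in exclude_terms) + ")"
    return query

def matches(title, include_terms, exclude_terms=()):
    # Crude title filter standing in for the database's own search engine.
    text = title.lower()
    return (any(t.lower() in text for t in include_terms)
            and not any(t.lower() in text for t in exclude_terms))

sample_titles = [
    "Nanostructured anodes for solid oxide fuel cells",
    "SOFC stack sealing with rare-earth doped ceramics",
    "Molten carbonate fuel cell degradation",  # a different fuel-cell type: noise
]

terms = ["solid oxide fuel cell", "solid oxide fuel cells"]   # first pass: basic terminology
print(build_query(terms), [t for t in sample_titles if matches(t, terms)])

terms.append("SOFC")                                          # refinement: add the acronym to improve recall
noise = ["molten carbonate"]                                  # exclude unwanted terms to improve precision
print(build_query(terms, noise), [t for t in sample_titles if matches(t, terms, noise)])

The second pass captures the SOFC-acronym title missed by the first, which is exactly the recall improvement an expert review of the initial sample is meant to prompt.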

The second and third QTIP factors go hand in hand. Analytical software expedites profiling and discovery operations on the retrieved text resources. Scripting semi-automates the processing steps to achieve the desired information products. In this example I apply VantagePoint software [4] supported by Visual Basic macros. Let's jump ahead and look at the results, then discuss how these are generated.

Figure 1 presents the essence of a response to "what's hot in SOFCs?" This appears quite complicated. However, imagine that the target users of this information are used to seeing exactly this sort of profile in response when they ask "what's hot in technology X?" They would recognize and understand each component indicator, and know what to look for.

Figure 1. Technology "One-Pager"

Accompanying this "one-pager" would be the analyst's interpretation. This could well point to action recommendations or posing of key choices. Furthermore, the further the analysis probes into advanced technologies, the more critical it becomes to obtain input from substantive experts.

[4] VantagePoint has been commercially available since 2000 from Search Technology, Inc. Its development has been jointly sponsored by the U.S. Defense Advanced Research Projects Agency (DARPA) and the Army Tank-automotive and Armaments Command (TACOM). It is available for U.S. Government use as Tech OASIS. In addition, a commercial software tailored for use with the Derwent World Patent Index, Science Citation Index (Web of Knowledge), and Delphion patents is offered by Thomson Scientific as Derwent Analytics.

Some points of note in Figure 1:
- The patent trends (upper left) show that overall SOFC activity is lively and increasing.
- Patents are increasingly being secured in multiple patent authorities ("families" of multiple patents on the same invention), implying strong commercialization prospects.
- The SOFC topic map (upper right, based on factor analysis of keywords appearing in multiple papers) shows an intriguing "cluster of clusters" in the upper region. We identify this as nano-surfaces and rare-earth materials -- "nano-combo" for shorter labelling.
- The publication trends (lower left) show this nano-combo of topics increasing strongly in contrast to the other clusters.
- The Hot Stuff? box (lower right) spotlights several indicators of technological advance:
  o publication and patent activity relating to the candidate hot topic ("nano-combo" -- nano-surfaces and rare-earth materials) is substantial
  o research, as measured by S&T publications, is hot
  o patenting, especially new (priority) patenting, is much less recent
  o another indicator of how hot a research area is -- the ratio of conference to journal publication -- is relatively low for this sub-topic in comparison to the larger research domain [we might want to explore this discrepancy with our subject experts]
  o several terms appear for the first time in year 2002 publications [we could use these to stimulate discussion with our subject experts]
  o three companies each show 7 or more patents in this sub-topic, whereas no others hold more than 3 [these might warrant further investigation to see how their interests fit]

Note that much of this information is best provided to subject-matter experts in our organization for their review and interpretation. This is quite likely to lead to another round of "tech mining" with them. The result of that would be suitable intelligence for senior technology managers to help determine next steps. One such step might be to probe whether we want to pursue joint development efforts with another organization.
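Several of the indicators in the Hot Stuff? box can be computed directly from the downloaded records. The Python sketch below uses invented toy records, not the actual SOFC data, to illustrate the kind of tallying a scripted routine would perform: the conference-to-journal ratio, terms appearing for the first time in the latest year, and concentration of patenting by assignee.

from collections import Counter

publications = [
    {"year": 2001, "venue": "journal",    "keywords": {"sofc", "anode"}},
    {"year": 2002, "venue": "conference", "keywords": {"sofc", "nano-surface"}},
    {"year": 2002, "venue": "journal",    "keywords": {"sofc", "rare-earth", "nano-surface"}},
]
patents = [{"assignee": "Company A"}, {"assignee": "Company A"}, {"assignee": "Company B"}]

# Ratio of conference to journal publication (often higher in younger, hotter areas).
venues = Counter(p["venue"] for p in publications)
conf_journal_ratio = venues["conference"] / max(venues["journal"], 1)

# Terms appearing for the first time in the most recent year.
latest = max(p["year"] for p in publications)
earlier_terms = set().union(*(p["keywords"] for p in publications if p["year"] < latest))
new_terms = set().union(*(p["keywords"] for p in publications if p["year"] == latest)) - earlier_terms

# Concentration of patenting: assignees holding several patents in the sub-topic.
top_assignees = [a for a, n in Counter(p["assignee"] for p in patents).items() if n >= 2]

print(conf_journal_ratio, new_terms, top_assignees)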

Figure 2 responds to the second question, more specifically profiling a particular candidate Australian organization with which to partner to meet our technological needs. Note that this example analysis does not focus on "nano-surfaces and rare-earth materials," but addresses SOFCs generally. Some points of note:
- The Scorecard (upper left) helps assess whether we seem to have a good fit. It is deliberately not quantified. Its components include judgmental aspects ("Tech Fit") as well as quantitatively based ones ("Tech Concentration" of their patenting in the target domain of SOFCs).
- Within the Scorecard, the Capabilities Spectrum synthesizes information to draw implications regarding this company's relative strengths.
- Below the Scorecard, we sum up information gleaned from Ceramic Fuel Cells Ltd.'s website, deemed pertinent to deciding on their suitability as a development partner.
- Ceramic Fuel Cells' patenting trend (lower left) and patent citation tree (upper right) speak to the currency and importance of their intellectual property.
- The "knowledge network" map (lower central) depicts their inventors' collaborative activity, finding two teams (of which the trio in the upper left remains active).
- The table (lower right) indicates the top inventors and authors.

From this digest of the company's open R&D face, we pose the action question for managerial decision.

Figure 2. Organizational "One-Pager"

Sources of information should be chosen to meet one's needs. I emphasize mining of R&D publication and patent abstract records. But note that we also tap internet sources -- here, for company information. In general, we find that the databases provide much richer S&T information resources with a measure of quality control. We like to use the internet to complement these by providing more up-to-date and contextual information. For instance, these analyses might point to key research centers; we could then seek their websites to learn more about their interests, contact information, etc. [In general, we prefer to first exploit the R&D databases, then update and probe using the internet.]

Note that Figures 1 and 2 reflect distinctly different considerations. Figure 1 profiles technology development activity across multiple organizations. Figure 2 profiles one company's activity -- in this case for one technological development domain, SOFCs. Other variants of company profiling might compare the company's activities across technologies, or probe more deeply into a more specific sub-area (e.g., nano-surfaces and rare-earth materials for SOFCs). What they hold in common is a compilation of empirical information to help answer a particular MOT question.
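The inventor "knowledge network" in Figure 2 is, at bottom, a co-occurrence count: how often pairs of inventors appear on the same patent. A minimal Python sketch of that tallying, with placeholder names rather than the actual Ceramic Fuel Cells inventors:

from collections import Counter
from itertools import combinations

patent_inventors = [
    ["Inventor A", "Inventor B", "Inventor C"],
    ["Inventor A", "Inventor B"],
    ["Inventor D", "Inventor E"],
]

edges = Counter()
for inventors in patent_inventors:
    for pair in combinations(sorted(set(inventors)), 2):
        edges[pair] += 1  # edge weight = number of joint patents

# Strong, repeated ties indicate a working team; isolated pairs suggest separate groups.
for (a, b), weight in edges.most_common():
    print(f"{a} -- {b}: {weight} joint patents")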

II. Discussion

This paper illustrates how to compose informative decision support from empirical information concerning various facets of an emerging technology -- quickly. Collectively, the integration of the four QTIP factors results in a qualitative change in FTA. We know of a major corporation that reduced its time to provide a key set of competitive technological intelligence (CTI) analyses from 3 months to 3 days. With another firm, we have been exploring text mining tool applications. We mutually recognized that certain preliminary analyses could be done in 3 minutes, enabling refinement of information searches that would drastically upgrade subsequent FTA work.

These two examples reflect an essential difference. The "3-day" QTIP addresses the technology information needs of end-users, such as senior technology managers or policy-makers. They would not be expected to perform the analyses themselves. In contrast, the second, "3-minute" example indicates that others engaged in technology analyses have special needs too. The "quick" in this case serves the person performing the search and analysis. Design of QTIP tools and functions must address the diverse needs of all the players. "Process management" factors should be considered for all types of QTIP players:
- information providers (e.g., meeting their needs for profits and protection of their intellectual property)
- information professionals (e.g., in coordinating licenses and access to databases and analytical tools)
- technology analysts (e.g., power users of these capabilities on a regular basis)
- researchers, technologists, and some managers (e.g., occasional users of the databases and analytical tools)
- decision-makers (e.g., policy-makers and managers who weigh emerging technology considerations as either their main focus or as contributing factors, but do not perform the analyses personally).

Process management calls for explicit attention to how the analyses and their outputs can best be organized to enhance utility. Technology analysts need to think beyond what constitutes valid and impressive analyses to what their target users want and what mechanisms can best communicate to them. vii viii A key principle is to maximize engagement and ongoing interaction of the QTIP players with each other.

Recognition of the potential for speedy analyses should lead to rethinking the bases for technology management (MOT). Over the past decades, many management domains have come to rely quite heavily upon empirical evidence. For example, manufacturing process management used to depend completely on tacit knowledge. A supervisor spent decades gaining familiarity with his (or occasionally her) machines, people, and processes. He "knew" if something was not working right and initiated repairs accordingly. What could be better than this deep, personal knowledge? Well, it turned out that actual data were better. Compiling and making available performance histories for machines and processes enabled modern Quality Control (QC). When the potential was recognized, process managers realized that dramatic improvements in quality were possible. There would be no "Six Sigma" quality standards without empirical data and analyses thereof.

Technology management, somewhat surprisingly, is among the least data-intensive managerial domains. One would think that scientists, engineers, and technology managers would naturally pursue empirical means to manage R&D and its transition into effective innovations. Not at all -- even in tracking our own performance, researchers strongly prefer peer judgment to bibliometrics. The technical community has a deep distrust of metrics. This poses an additional challenge to be overcome in implementing empirically informed technology management.

Of course, many do use empirical information in S&T arenas. Researchers usually mine the literature to find a few "nuggets" that speak closely to their interests. Patent analysts traditionally sought the few key pieces of intellectual property. Tech mining offers qualitatively different capabilities. It can uncover patterns that reflect competitor strategies. ix It can also enable researchers and R&D managers to gain a global perspective on entire bodies of research. That can help position research programs and identify complementary efforts by others. On another level, the Dutch government allocates research support to universities based in part upon their publication records. Publications are weighted according to disciplinary journal impact criteria. Journal Citation Reports provide the basis for calculating the merits of individual and unit outputs. This is certainly not a foolproof system, but it provides a more objective set of metrics than the "good old boy" peer review mechanisms.

Certainly, this "tech mining" approach to quick technology analyses does not equally affect all forms of FTA. This paper explores the potential to expedite certain technological intelligence functions. "Tech mining" exploits the information compiled by S&T and other (e.g., business) databases. As such it represents one advanced form of technology monitoring. This information can serve other FTA needs to various degrees:
- Technology Foresight -- Quick tech mining can help participants grasp the scope of technology development efforts. Access to results in interactive mode (e.g., using the VantagePoint Reader software) enables digging down to locate specifics on a point of interest -- e.g., identifying an active researcher on a particular topic.
- Technology Forecasting -- QTIP can provide empirical measures for certain trend analyses to support growth model fitting and trend extrapolation. It can also help locate experts to engage in judgmental forecasting.
- Technology and Product Roadmapping -- QTIP serves background information roles well. It is vital in documenting external technology development activities to track their likely trajectories.
- Technology Assessment -- Again, QTIP can help scope the extent of R&D activities. Exploiting contextual information resources that cover policy, standards, public concerns, possible health and environmental hazards, and perceived technological impacts can further support TA activities.

In sum, tech mining offers partial, but potentially very effective, support for these varied FTA endeavors. QTIP emphasizes speed in generating technology analyses. Speed surely must be tempered by need. The sidebar vignette offers a realistic scenario of how this could unfold. The driver is "when do you need to have what information?" Note that this seriously alters relationships and expectations between manager-users and technology analysts.

Particularly for academic researchers, we have an inclination to say "we can deliver a fine analysis; it will take two semesters to complete." Instead, the quick mindset has the user set the defining temporal parameter -- the deadline -- and then we technology analysts fit into that schedule. Most importantly, this changed mindset opens up tremendous potential for better-informed MOT.

Sidebar: Hypothetical QTIP Vignette

8:00 am: The Vice-President for Research at Georgia Tech asks me to benchmark this university's SOFC research against the leading American universities for a presentation this noon. I get his suggestion on who, on campus, is active in fuel cells. We decide to focus on the last 5 years. He wants 3 PowerPoint slides like those we used last month in a similar benchmarking exercise.

8:05 am: I finish a quick Dialog "DialIndex" search that identifies which databases contain the most SOFC information. I select two that provide good coverage and are licensed for unlimited use by Georgia Tech.

8:10 am: I complete simple searches in SCI and EI Compendex, downloading 500-record samples of recent publication abstracts with SOFC in titles or keywords.

8:15 am: I import each search into VantagePoint and scan the keywords to ascertain if the search should be expanded to include other terms, or restricted to eliminate noise. Inspection of EI Compendex class codes helps determine whether classification-based searching should also be used. Perusal of the organizational affiliations of the authors suggests possible benchmark universities.

8:40 am: I search a compilation of Georgia Tech publication records to augment the VP's awareness of who is active in fuel cells. I check that my search strategy captures most of the Georgia Tech authored papers to help validate the query.

8:55 am: I phone around to find one local subject matter expert willing to review my search strategy to spot gaps or other weaknesses. Bill is available for a "3-minute" review before class. I e-mail my digest and we discuss it on the phone.

9:00 am: I undertake the 'final' searches in SCI and EI Compendex and download hundreds of SOFC records for the most recent 5 years.

9:30 am: The records are imported into VantagePoint. A script runs data fusion and duplicate removal. An additional script profiles the leading researchers at each of the "Top 3 + Georgia Tech" American universities in the SOFC domain. A comparative 5-year trend script is run. Results are pasted from MS Excel into MS PowerPoint "GT Benchmarking" slide templates.

10:00 am: An auxiliary search is run on a U.S. Department of Energy R&D projects database for these four universities. A script generates a table showing the overall DOE project activity that each university evidences on fuel cells. It generates pie charts showing how much each focuses on SOFCs out of its energy research.

10:20 am: Bill reviews the 3 PowerPoint slides, and notes that Georgia Tech has collaborated recently with a key researcher at one of the other universities. He notes that we have left out a key Georgia Tech SOFC researcher who leads many sponsored research projects on which open literature publication is not appropriate.

10:45 am: PowerPoints with interpretive comments, and a short background technical report, are provided to the VP.
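As an illustration of the 9:30 am "data fusion and duplicate removal" step, the following Python sketch merges records retrieved from two databases and collapses near-duplicates on a normalized title key. The field names and matching rule are assumptions for illustration only, not VantagePoint's actual scripts or schema.

import re

def title_key(record):
    # Normalize titles so the same paper indexed by SCI and EI Compendex collapses to one key.
    return re.sub(r"[^a-z0-9]+", " ", record["title"].lower()).strip()

def fuse(*record_sets):
    merged = {}
    for records in record_sets:
        for rec in records:
            # Keep the first copy seen; a fuller script would merge fields from both sources.
            merged.setdefault(title_key(rec), rec)
    return list(merged.values())

sci = [{"title": "Nanostructured SOFC anodes.", "source": "SCI"}]
compendex = [{"title": "Nanostructured SOFC Anodes", "source": "EI Compendex"}]
print(fuse(sci, compendex))  # one record survives the de-duplication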

This paper focuses on the idea that informative mining of S&T information resources can be done quickly and powerfully. Once that is accepted, extensive opportunities arise. The information resources are largely, but not completely, texts. "Text mining" tools are progressing rapidly. x xi These draw on both statistical and artificial intelligence approaches. Advanced entity extraction, query refinement, and elucidation of relationships based on text co-occurrence patterns can extend QTIP possibilities. Development of information visualizations especially for S&T offers great potential. xii xiii

To close, this "new" method brings to bear available S&T information resources and analytical tools to generate FTA more quickly. Its novelty lies in the approach to technology analyses in support of technology management. To fully realize QTIP potential requires significant process management change:
- Systematize strategic business decision processes.
- Mandate explicit technology information products be provided for decision stages in such processes.
- Provide each researcher, development engineer, project manager, intellectual property analyst, etc. with direct, desktop access to a couple of the most useful S&T information databases.
- Negotiate unlimited use licenses for those databases.
- License easy-to-use analytical software for all.
- Script the routine analytical processes.
- Develop standard output templates (information visualizations).
- Train the potential QTIP participants in use of the tools and resulting FTA outputs.

But it's worth the effort. I am convinced that quick "tech mining" can dramatically improve MOT effectiveness. I would go so far as to forecast that the technology manager who relies solely on intuitive information faces extinction. The manager who incorporates data-based intelligence into decision processes will be better informed, and that will lead to competitive advantage.

References:

i Porter, A.L., and Cunningham, S.W., Tech Mining: Technology Management through Information Mining, Wiley, New York, to appear.
ii Teichert, T., and Mittermayer, M-A. (2002), "Text Mining for Technology Monitoring," IEEE IEMC 2002.
iii Watts, R.J., and Porter, A.L. (1997), "Innovation Forecasting," Technological Forecasting and Social Change, Vol. 56.
iv Newman, N.C., Porter, A.L., and Yang, J. (2001), "Information Professionals: Changing Tools, Changing Roles," Information Outlook, Vol. 5, No. 3.
v Porter, A.L., Kongthon, A., and Lu, J-C. (2002), "Research Profiling: Improving the Literature Review," Scientometrics, Vol. 53.
vi Börner, K., Chen, C., and Boyack, K.W. (2003), "Visualizing Knowledge Domains," Annual Review of Information Science and Technology, Vol. 37.
vii Porter, A.L., Yglesias, E., Kongthon, A., Courseault, C., and Newman, N.C., "Getting What You Need from Technology Information Products," Research-Technology Management, to appear.
viii de Bruijn, H., and Porter, A.L., "The Education of a Technology Policy Analyst -- to Process Management," Technology Analysis and Strategic Management, to appear.
ix Ernst, H. (2003), "Patent Information for Strategic Technology Management," World Patent Information, Vol. 25, No. 3.
x Kontostathis, A., Galitsky, L.M., Pottenger, W.M., Roy, S., and Phelps, D.J. (2004), "A Survey of Emerging Trend Detection in Textual Data Mining," in Berry, M.W. (Ed.), Survey of Text Mining: Clustering, Classification, and Retrieval, Springer, New York.
xi See
xii Chen, C. (2003), Mapping Scientific Frontiers: The Quest for Knowledge Visualization, Springer, London.
xiii Shiffrin, R.M., and Börner, K. (2004), "Mapping knowledge domains," Proceedings of the National Academy of Sciences, Vol. 101 (Suppl. 1).

Presentation 1:

One Word Summary
- QUICK -- Technology Analyses in Minutes, not Months

Tech Mining -- 4-Step Process to Empirical S&T Intelligence
- Instant Database Access: S&T publication abstracts (e.g., Science Citation Index); patent abstracts (e.g., Derwent World Patent Index)
- Analytical Software [e.g., VantagePoint]
- Automated Routines [e.g., Visual Basic scripting]
- Decision Process Standardization [e.g., Stage-gate processes]

Tech Mining Framework
- 39 Technology Management Questions
- ~200 Innovation Indicators
- Growing # of Question-answering Templates

Example Quick Technology Intelligence: Solid Oxide Fuel Cells
- Technology or Organization focus
- Answer "What?" and "Who?" questions
- One-pager indicator compilations

Company at a Glance: Ceramic Fuel Cells Ltd.
[Graphic: the organizational one-pager, combining a Scorecard (Tech Fit, Tech Coverage, Tech Concentration, and a Capabilities Spectrum across research, development, manufacturing, and commercialization), a U.S. patent citation tree ("Who uses their patents?"), the company patenting trend, the company knowledge network of inventors, tables of top inventors and top authors, and a note that investors in the company span 12 organizations, including Australian manufacturing, power, gas, investment, and government interests. Next step: initiate contact with Foger or Badwal?]

One-Pager Components
- Scorecard -- for quick screening of alternatives
- Company Information -- from the web
- R&D Trends -- how active lately?
- Patent Impact -- who cites their patents?
- Inventor Teams -- knowledge networks
- Leading Inventors -- core players? still there?
- Action Recommendations -- so what? next step?

Quick Decision Support
1. Provide empirical technology answers in minutes
2. Change the expectations of policy-makers and managers
3. Enable research domain profiling by and for researchers and research managers too
4. Support other forms of Future-oriented Technology Analyses

Resources
- Example outputs of our Technology Opportunities Analysis and HotTech profiling at:
- Information on the text mining software used here, VantagePoint, at:
- Tailored for Derwent data: Derwent Analytics:
- Tech Mining book by Alan Porter and Scott Cunningham, due in 2004 from Wiley

PAPER 2: THE SCAN PROCESS AND TECHNOLOGY FORESIGHT

Kermit M. Patton
Research Director, Scan
SRI Consulting Business Intelligence

THE PREMISE

Predicting the future is impossible. The inherent unpredictability of technology development and commercialization processes means that highly structured technology plans based on market predictions and point forecasts can be limiting, if not dangerous, when organizations plan for new technologies. Maintaining the flexibility to accommodate constantly changing market dynamics has become essential in technology planning and foresight. Constant monitoring of the commercial, cultural, and technological environments is important in maintaining the needed flexibility. But most organizations have failed to accomplish such monitoring in a consistent or systematic way. This paper describes two tools that help organizations maintain accurate technology foresight.

SRI Consulting Business Intelligence's (SRIC-BI's) Scan process is an organizational group process that provides organizations with the means to monitor the market environment continuously for signals of change that influence technological and other developments. SRI International's Structured Evidential Argumentation System (SEAS) is a software system that integrates large amounts of data in a collaborative environment to provide early alerts to decision makers. SEAS uses automated evidence-based reasoning to produce early conclusions about potential technological opportunities or risks facing an organization. (See Structured Evidential Argumentation System.)

Technology-development uncertainties have proliferated during the past decade as a result of the increasing complexity and turbulence in the marketplace in which modern technologies now succeed or fail. Globalization, privatization, deregulation, competition, and an acceleration of the advances in science and technology all have further complicated an already complex business environment. The commercial environment is constantly emerging from the interactions of thousands of variables -- from market-driven pricing processes to government regulations, from consumer opinion to market competition, from international trade flows to the development of new materials -- that defy comprehension, let alone quantitative analysis and prediction.

As complexity increases, successful businesses will be those that turn themselves into adaptive systems that work in an organic manner to find, capture, interpret, and act on cues from an ever-changing environment. Stephan Haeckel, author of Adaptive Enterprise, notes, "Where organizations choose to place their sensory probes and how they distinguish signals from random noise determine whether they will be sufficiently aware of what is happening out there." The marketplace is an ever-changing, turbulent confluence of commercial, societal, and technological factors. The most important tools for remaining afloat in the turbulence are a constant awareness of the changes going on around your organization and the ability to sense, make sense of, and adapt to these changes.

Navigating and monitoring uncertainty and turbulence is a process that benefits from the application of disparate tools. Both Scan and SEAS are valuable tools in this context, but their differences are significant. SRIC-BI's Scan process depends heavily on human cognition and pattern-recognition capabilities, group discussion, brainstorming, creativity, ideation, and humor. As Ray Kurzweil points out in his book The Age of Spiritual Machines, the bulk of human neural circuitry excels at pattern-recognition functions. The human mind has an enviable capacity to scan the thousands of variables operating in the turbulent marketplace, sort the results, and extract patterns on which to base decisions for future actions. The Scan process provides a framework with which we can regularly and systematically marshal the pattern-recognition capabilities of a group of professionals to identify important changes in the business environment.

THE PROMISE

The organizations that survive today's marketplace turbulence will be those that can adapt rapidly to change. The organizations that thrive in today's turbulence will be those that live for change, are constantly aware of developments emerging beyond their own particular domain, and recognize oncoming threats in time to turn them into opportunities. The management literature is replete with admonitions to pay attention not only to competitors but also to external factors, discontinuities, and signals of change. The very title of Andy Grove's (former chairman and CEO of Intel) management book Only the Paranoid Survive trumpets the premise that a necessary practice of successful managers is the constant, furtive glance over the shoulder to avoid being blindsided by circumstances or competitors. Dorothy Leonard-Barton, in her book Wellsprings of Knowledge, maintains that the most important streams of knowledge for any company are not internal but rather those that flow in from outside the company.

Simple awareness of signals of change is insufficient in and of itself to provide an organization with a competitive edge. A futures orientation among decision makers is necessary to take advantage of foreknowledge of change. Eric D. Beinhocker and Sarah Kaplan talk of creating "prepared minds...so that executives have a strong grasp of the strategic context they operate in before the unpredictable but inevitable twists and turns of their business push them to make...critical decisions in real time" ("Tired of Strategic Planning?," McKinsey Quarterly). But the management literature is short on practical solutions for methodically gleaning early signals of change from the surroundings or for cultivating a futures orientation in employees and managers. The companies that currently incorporate externalities well usually depend on a leader at the top of the corporation who performs the scanning function on a continual basis, has an inherent futures orientation, and imports the knowledge he or she develops into the decision-making process intuitively. The Scan process is a tool for collecting early signals of change and for nurturing a futures orientation more broadly in an organization.

THE PROCESS

Scan

The Scan process that this section of the paper describes is, in fact, a continuous loop that repeats endlessly -- or certainly has done so for the past 25 years. The text describes a beginning and an end point of the process, but the process is in fact continuous, with many parts going on simultaneously and in concert. The constant surveillance on the part of scanners, the rhythm of monthly Scan abstract meetings, and the deadlines of client deliverables (in which Scan findings appear) are part of the normal, continuous cycles of working at SRIC-BI. The process has created an awareness on the part of SRIC-BI employees of the constancy and importance of change in the business environment and the world at large. SRIC-BI employees come to expect the appearance of discontinuities and become familiar with how they play out in a variety of industries and domains. The analysts and researchers learn that change rather than stability is the coin of the commercial realm, and metamorphosis is the common currency.

The beginning point for describing the Scan process is the collection of a set of data points from the business, cultural, and technological environments. The data points can be events, developments, opinions, findings, or products that our researchers and analysts believe to be early signals that portend significant changes. Other ways of describing what we're looking for include:
- Signals of change
- Discontinuities
- Outliers (events or developments that are off the current trend line)
- Items that defy conventional wisdom
- Inflection points
- Disruptive developments or technologies.

Our scanners cast their nets broadly to bring in signals of change from various domains, including:
- Politics
- Regulation
- Culture
- Consumer behavior
- Public opinion
- Business processes
- Science
- Technology.

The breadth of scope inherent in the diversity of the categories represents one of the most important strengths of the Scan process. The turbulence of the marketplace (and the fate of particular technologies in the marketplace) consists of the confluence of factors from all these categories. And foreseeing the fate of particular technologies in the marketplace depends on reading the early signs from, and interactions among, all these categories (see Figure 1). Organizations that focus on their own industry and areas of expertise will miss important signs from the broader business, cultural, and technological environments.

Figure 1. THE COMPLEX MARKET ENVIRONMENT (Source: SRI Consulting Business Intelligence, SRIC-BI)

Each month SRIC-BI employees enter more than 100 abstracts describing signals of change into an online Scan database. The Web-based entry form includes fields to cite the source of the signal, to summarize whatever the abstractor believes is important about the article or event (which may have nothing to do with the original author's premise), and to suggest implications for the business environment. The computer system assigns each abstract a reference number. The database is text searchable, and users can collect individual abstracts from any month into sets of any number of abstracts by date, topic, scanner, or source. At the beginning of each month, the database administrator finalizes the current month's set of abstracts and directs the continuing stream of incoming abstracts to begin the next month's set.
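To make the mechanics concrete, here is a minimal, self-contained Python sketch of a Scan-style abstract store with the fields just described (source, summary, implications), automatic reference numbers, simple text search, and monthly sets. It is an illustration of the idea only, not SRIC-BI's actual system.

import itertools

class ScanDatabase:
    def __init__(self):
        self.abstracts = []
        self._ids = itertools.count(1)

    def submit(self, scanner, source, summary, implications, month, topics=()):
        entry = {
            "id": next(self._ids),  # reference number assigned by the system
            "scanner": scanner,
            "source": source,
            "summary": summary,
            "implications": implications,
            "month": month,
            "topics": set(topics),
        }
        self.abstracts.append(entry)
        return entry["id"]

    def search(self, text):
        # Full-text search across summaries and implications.
        text = text.lower()
        return [a for a in self.abstracts
                if text in a["summary"].lower() or text in a["implications"].lower()]

    def monthly_set(self, month):
        return [a for a in self.abstracts if a["month"] == month]

db = ScanDatabase()
db.submit("analyst-1", "trade press article", "A microchip that runs a minimal Web server",
          "Small or portable devices could act as Web servers", "2004-05", ["information technology"])
print(db.search("server"))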

What makes a good Scan abstract? New employees at SRIC-BI generally participate in Scan for six months before managing to submit consistently good Scan abstracts. Developing an intuitive appreciation for unusual patterns and learning to distinguish between truly innovative developments and those simply repackaged by promoters can take a considerable amount of time and discipline. An employee's first submission might concern a new microchip from Intel that is twice as fast as and half the size of its predecessor. But that microchip is directly on the projected development curve of microprocessors. Moore's law predicted that chip 20 years ago -- not quite what we're looking for. After six months, the same employee might submit an abstract on the development of a microchip that contains all the software necessary to run a minimal Web server -- a much more interesting development in terms of potentially enabling small or portable devices to serve as Web servers.

One recent Scan abstract described how Tetsuya Tada, chief engineer for the development of one of Toyota's recent concept cars, complains that young people today pay more attention to cell phones than to cars. The abstract's implication is that cars compete in the marketplace with cell phones, an apparently absurd assumption given that the prices and functions of the two items are so dramatically different. But if we, just for the moment, entertain the assumption, we become aware of the possibility that products in today's highly competitive environment are increasingly competing with products outside their category for the attention of the consumer. In an attention economy, products compete with every other product on the market. If a company wants to attain mind share in a large segment of the consumer market, concentrating on flash instead of function when making technology decisions may be an appropriate strategy. The abstract is valuable because it questions conventional wisdom and broadens the reader's concept of competition beyond the traditional bounds.

Each month's set of 100-plus abstracts serves as the starting point for an open-ended discussion and brainstorming session by analysts, researchers, managers, sales and marketing staff, and consultants. Half of the meetings each year are open to client observation or participation. SRIC-BI's staff in Croydon, England, hold a bimonthly Scan meeting in addition to the monthly meeting at company headquarters in Menlo Park, California.

The meetings consist of two parts. The first part is facilitated rather than led and takes the form of a free-floating discussion of any of the Scan abstracts that participants find provocative, interesting, disturbing, or important. The facilitator discourages judgmental, idea-killing behavior and steers the discussion clear of extended exchanges of opinion or philosophical discussions. Politics and philosophy are definitely fair game, but arguments simply waste the group's time. Frequent calls for new clusters of abstracts or discussion topics are necessary to mine the month's abstracts as thoroughly as possible for signals of change. The facilitator makes certain the discussion stays reasonably close to the abstract data points in order to make sure the meeting doesn't degenerate into a discussion unrelated to client needs.

The Scan meeting facilitator urges participants to identify clusters of abstracts. This paper describes just two of the countless ways Scan can lead to valuable insights. In the first method, a cluster of several abstracts can characterize a conceptual overlay that a client organization can lift off the Scan data and apply to its own processes, products, or services. This type of clustering allows companies to gain ideas from other industries or product domains. Figure 2 demonstrates this type of overlay or conceptual pattern.

Three abstracts from different areas (air-quality assessment, health care, and the auto industry) demonstrate new applications of continuous monitoring. Continuous monitoring, as a concept, is not new -- of course thermostats use the basic principle to control the temperature of rooms. But the three abstracts demonstrate how new networking, computing, and sensing technologies are dramatically expanding the capabilities of, and the domains in which, continuous-monitoring concepts can operate. An awareness of such new capabilities will serve as a jumping-off point for generating ideas for new technology-based products and services.

Figure 2. CONCEPTUAL OVERLAYS (Source: SRIC-BI)

The second method operates in a cross-category manner to help Scan researchers identify the defining forces that are operating in the business environment. When abstracts on particular topics (such as wireless technologies or privacy concerns) constitute clusters that cross industry-domain categories (such as health, education, information technology, retailing, and government), the analysts know that the technology or topic will have widespread impact (see Figure 3).
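The cross-category test behind this second method can be sketched in a few lines of Python. The categories and topic tags below are invented toy examples; the point is simply that a topic tagged in abstracts from many different industry domains is a candidate defining force.

from collections import defaultdict

abstracts = [
    {"category": "Health Care",   "topics": {"wireless", "privacy"}},
    {"category": "Retailing",     "topics": {"wireless"}},
    {"category": "Education",     "topics": {"wireless"}},
    {"category": "Manufacturing", "topics": {"privacy"}},
]

categories_per_topic = defaultdict(set)
for a in abstracts:
    for topic in a["topics"]:
        categories_per_topic[topic].add(a["category"])

# Flag topics that cut across several industry domains as potential defining forces.
defining_forces = {t: cats for t, cats in categories_per_topic.items() if len(cats) >= 3}
print(defining_forces)  # e.g. {'wireless': {'Health Care', 'Retailing', 'Education'}}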

Figure 3. DEFINING FORCES -- Scan abstracts typically cluster around industry categories (health care, advertising and retailing, manufacturing, information technologies and the Internet, education); topics such as wireless technologies and privacy that cut across those clusters are potential defining forces. (Source: SRIC-BI)

The second part of the meeting consists of identifying the topics and clusters from among the results of the brainstorming session that bear further analysis and research for potential presentation to client organizations. Participants place the topics on a very rough ranking spectrum between "actionable" and "speculative," simply as a means of preserving a perspective on a time frame for a projected impact.

Clients observing or participating in the Scan meetings for the first time frequently comment on the fact that the meetings are dramatically different from meetings that they typically experience in a corporate setting. The meetings are nonhierarchical because participants know the process works best if everyone values each other's contributions regardless of rank in the organization. The meetings are relatively self-regulating because the participants have sufficient experience to know what kind and what level of discussion and participation are most productive. The meetings include a wide variety of expertise and backgrounds, from technology to specialties in consumer behavior, from engineering to anthropology, and from management to marketing.

Following the Scan meetings, the filtering process that identifies valuable ideas and knowledge for client organizations begins in earnest. The Scan researchers and analysts carefully examine the clusters of abstracts and potential topics that surfaced during the meeting. The analysts compare the new topics and ideas to ones the Scan process has previously identified, and they probe and test the ideas and topics for their substance, plausibility, and potential implications. The analysts finally conduct some quick research to gather more evidence. Particularly time-sensitive topics immediately become Signals-of-Change documents that see circulation to clients within the month. Topics requiring further research or analysis and longer explications become topics for Scan's white papers, which the program calls Insights and which have a longer development and publishing schedule.

The program's Scan Monthly presents four new Signals of Change and four new Insights each month.

The Players

Because we are interested in a wide variety of perspectives in the abstract-collection process and the Scan meeting, we solicit participation from:
- Researchers and analysts
- Technology monitors
- Strategy consultants
- Principal consultants
- Marketing and sales staff.

SRIC-BI hires from a wide variety of academic and professional backgrounds including anthropology, business, economics, international affairs, communication arts, marketing, life sciences, and chemical and electrical engineering. Employees from all levels of the organization, from the CEO on down, participate in the process of submitting abstracts and attending Scan abstract meetings. The Scan abstract sets include abstracts from staff in SRIC-BI's Tokyo, Japan, and Croydon, England, offices, providing a global perspective. Employees participate primarily on a voluntary basis because creative or proactive thinking is difficult if not impossible to mandate. The Scan program is interested in having people participate who are interested in participating.

Scan experience on the part of participants is highly valuable for the Scan process. Learning what constitutes a good Scan abstract can take six months to a year of attending Scan meetings. Learning to identify truly unique clusters of abstracts can take a year or more. Experience on the part of Scan meeting participants also makes for a smooth meeting, with participants pacing the introduction of new topics themselves rather than depending on the facilitator to set the pace. A consistent set of attendees at Scan meetings also establishes a memory for the meetings so that topics and ideas don't appear repeatedly unless new developments merit a resurfacing of the topic.

The Product

The most important product of the Scan process, either through subscription to Scan or through customized implementation within a company, is an increased awareness on the part of planners, employees, and managers of the importance of a heads-up attitude about the external environment. The likelihood of technology plans being blindsided by external developments increases every year with the increasing complexity and competition in the business environment. The Scan process provides a language, infrastructure, and mind-set for cultivating a future orientation in any organization.

In order to distribute a future orientation throughout client companies, Scan offers clients both push and pull distribution mechanisms. A pull mechanism, in which employees in the client company can pull content from the Scan Web site as the need arises, is a necessity in today's fast-paced environment. To meet this need, the searchable Scan archive is available 24-7 to researchers and clients. A push mechanism, in which the Scan program pushes content into distribution within the client company, is necessary because the topics and questions that Scan regularly surfaces are not typically on the radar screen or agenda of client companies. The Scan Monthly serves the push function by highlighting four new Signals of Change each month that have come out of the latest Scan meeting and announcing the availability of four new Insight documents that further explore the implications of previous signals of change. Each month's Signals of Change are also available to subscribing clients in HTML format on the Scan Web site. Recent Signals of Change include:
- The End of Actuarial Medicine?
- Neuromarketing
- Animate, Inanimate, or Neither?
- Automating Research
- Bettor Predictions
- eScience
- China's Global Designs
- The United States of Asia
- Brand Exhaustion
- Beauty Medicine and the Worried Well
- E-Commerce Ecosystems.

SRIC-BI has no patent on the Scan process and has the consulting expertise and experience to assist companies in creating their own internal scanning systems. Part of that experience consists of an awareness of the hurdles companies typically face in attempting to implement the Scan process internally. Hurdles include:
- Hierarchical meetings. The presence of a senior manager can inhibit the discussion and stifle innovative ideas and input. Junior employees don't want to risk looking bad. Senior managers must understand their role (refraining from normal decision-making, judgmental behavior patterns), and junior staff must feel comfortable expressing themselves.
- Accountantitis. Given the opportunity, the accountants will want immediate documentation of a return on investment for the cost of the meetings.
- Premature evaluation. The Scan process benefits from experienced participants, so the early meetings can seem ambiguous, unfocused, and unproductive.
- Naysayers. The Scan process is particularly susceptible to tunnel-visioned naysayers who focus on this quarter's earnings. One naysayer can deflate an entire room of energized, creative innovators. Selection of appropriate personality types for participation in the process is the most important success factor in implementing a Scan process.
- Low priority. To sustain the process beyond six months requires a strong commitment from the organization to make the process work and to use the results in planning and decision making.

Although the Scan process serves most effectively as an early-warning system, clients have found it helpful in other ways. Among the uses are as a form of peripheral vision (to avoid being blindsided by events outside one's industry), as an input to innovation processes, as a strategic stimulant, as a strategic irritant, and as a means of questioning the conventional wisdom or complacency within an organization. Through the years, Scan has played an essential role in our clients' technology foresight by providing a systematic means for surveying the broad external environment for change vectors. The monitoring process in most organizations is largely arbitrary, depending on what concerned individuals in the organization are reading, thinking about, and sharing informally with each other. But in today's world, arbitrary is insufficient. No foresight function can operate with confidence without a disciplined process for spotting new patterns of change and bringing those issues into the organization to discuss.

STRUCTURED EVIDENTIAL ARGUMENTATION SYSTEM
The signals of change that organizations respond to are typically complex or uncertain but potentially critical for their future. Successful performance thus requires dealing not only with identifying and assessing new signals on a timely basis, but also with channeling the signal issues into appropriate response modes such as options initiation, crisis management, and ongoing monitoring. To address the needs for ongoing monitoring in the government-intelligence community, the U.S. Defense Advanced Research Projects Agency (DARPA) sponsored SRI International to develop the Web-based tool SEAS for monitoring of important crisis topics. SEAS records an analyst's thinking about new intelligence in structured arguments, so that results are easier to understand and compare. SEAS is especially useful for facilitating decisions that must rely on results from complex collaborative-intelligence efforts. SRI developed the initial design of SEAS in 1990 for a state-owned oil company to provide early alerts relative to project-management plans for complex oil and gas facilities. SRI later generalized SEAS for DARPA to support crisis warning for national security. SEAS has been in active use in several U.S. intelligence agencies since 2001, and today, SEAS tools and methods can serve in any situation where regular monitoring of intelligence is necessary. By providing transparent, credible, early alerts to decision makers, SEAS allows effective response to changing situations. In this way, SEAS can enhance technology monitoring and forecasting efforts by leveraging its automated evidence-based reasoning system to integrate large amounts of data and provide clear early conclusions about potential technological opportunities or risks facing an organization.

Because SEAS is Web-based, it allows simultaneous access by a large number of users, thus promoting essential collaboration and peer review. The Web-server approach also supports a corporate knowledge base from which users can retrieve past analyses and learnings.

How SEAS Works
SEAS is an evidence-based reasoning system, which means that it uses structured arguments, or specific lines of reasoning that relate evidence to conclusions. Argument templates provide the structure in the form of a hierarchy of questions for monitoring a topic and developing conclusions. The answers to upper-level questions in the hierarchy derive automatically from the answers to lower-level supporting questions. The questions at the lowest, base level are answered directly by an analyst. The role of the analyst who is building an argument (that is, monitoring a topic) is to answer as many questions as possible and to attach supporting evidence together with a description of the rationale for the way the questions were answered (see Figure 4). SEAS acknowledges that desired information or evidence on a topic is often unavailable and thus does not require all answers before it develops a conclusion. Instead, SEAS continuously updates the current beliefs of the analyst, given the supplied information.
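To make the roll-up idea concrete, the sketch below implements a toy version of a hierarchical argument. It is a minimal illustration, not SRI's implementation: the question texts are invented, answers are mapped to a three-value green/yellow/red scale (the traffic-light metaphor described below), and the fusion rule used here, in which the worst answer among answered children wins and unanswered questions are ignored, is a simplifying assumption consistent with the idea that SEAS develops a conclusion from whatever evidence is currently available.

```python
# A minimal sketch of a SEAS-style structured argument (illustration only, not
# SRI's implementation). Assumptions: answers use a three-value green/yellow/red
# scale, an upper-level answer is the worst answer among its answered children,
# and unanswered questions are ignored, mirroring the idea that SEAS develops a
# conclusion from whatever evidence is currently available.

from dataclasses import dataclass, field
from typing import List, Optional

SCALE = {"green": 0, "yellow": 1, "red": 2}
COLOUR = {v: k for k, v in SCALE.items()}

@dataclass
class Question:
    text: str
    children: List["Question"] = field(default_factory=list)
    answer: Optional[str] = None                        # set only on base-level questions
    evidence: List[str] = field(default_factory=list)   # rationale and sources

    def conclusion(self) -> Optional[str]:
        """Derive this question's answer from its children (or return its own)."""
        if not self.children:
            return self.answer
        answers = [c.conclusion() for c in self.children]
        answers = [a for a in answers if a is not None]
        if not answers:
            return None
        return COLOUR[max(SCALE[a] for a in answers)]

# Hypothetical argument template for monitoring an emerging memory technology.
top = Question("Does this emerging technology threaten our current product line?", [
    Question("Has a working prototype been publicly demonstrated?"),
    Question("Has the developer secured significant funding?"),
    Question("Is the process compatible with existing fabrication equipment?"),
])

top.children[0].answer = "yellow"
top.children[0].evidence.append("press release announcing a prototype array")
top.children[1].answer = "red"
top.children[1].evidence.append("second venture-capital round closed")
# The third base-level question is left unanswered: evidence not yet available.

print(top.conclusion())   # -> "red", a traffic-light style early warning
```

In the actual tool the analyst also records the relevance and quality of each piece of evidence, controls who may view or alter the argument, and publishes the finished argument to the corporate memory; the sketch only shows how base-level answers propagate upward into a single, easily traceable conclusion.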

Figure 4: Example of SEAS design: structured arguments, templates, and graphical depictions of conclusions. Source: SRI International.

A key feature of the tool is that it presents the answers and conclusions of the argument graphically using a traffic-light metaphor, with a red light indicating a major development, a yellow light indicating some warning signs, and a green light indicating that no problems are evident. Results are readily understandable and are transparent in that warning signals are easy to trace to key evidence. Another key feature is that SEAS allows representation of numerous inputs and opinions. The evidence used by an analyst to support answers may come from a variety of information sources, such as published articles, Web pages, e-mails, interviews, observations, and another SEAS argument, and can be changing or

30 uncertain. Analysts assess and record the relevance and quality of the data. They may flag newly created evidence to alert other users to new information. An analyst wanting to share results or collaborate with others uses SEAS accesscontrol features to allow others to alter or just view the argument. When satisfied with an argument and its conclusions, the analyst or group of analysts can publish it to an audience and to the corporate memory. Publishing is an element of control over the argument and guarantees that an argument is always available in an unaltered form. Advantages of SEAS One strength of the SEAS tool is that it encourages disciplined, thoughtful, and timely analysis of disparate information by introducing a structure. In addition, SEAS implementation as a Web server enables easy collaboration across the company and leverage of past results and methods. The structured-argumentation methodology stimulates the analyst to consider and monitor the full spectrum of indicators. It also supports the analyst in doing his or her main task: analyzing the evidence and developing conclusions from that evidence. By providing transparent, credible early alerts to decision makers, SEAS allows effective responses to changing situations. SEAS can also foster strategic alignment. The key to success is for the template architect to spend sufficient time at the beginning to frame the problem correctly and to ask the right questions. The architect needs to break the problem down into a hierarchically structured set of interrelated questions, with the highest-level questions representing the company s strategic focus. The base-level questions to be answered by the analyst should be quantitative and phrased in such a way that two analysts sharing a common understanding of a situation would choose the same answers. Commercial applications of SEAS are now in implementation. The first version of the Web-based architecture was complete in 1999, and multiple versions have since followed. Several U.S. intelligence agencies have deployed SEAS. SRI and SRIC-BI are actively working with a number of companies to apply SEAS in a variety of monitoring applications. Suggested Reading Related to SEAS Lowrance, John D., Harrison, Ian W., and Rodriguez, Andres C. (2001) Capturing Analytic Thought. Proceeding of the First International Conference on Knowledge Capture, October, pp Mishra, S. and Rodriguez, A. and Eriksen, M. and Chaudhri, V. and Lowrance, J. and Murray, K. and Thomere, J. (2002) Lightweight solutions for user interfaces over the WWW. Proceedings of the International Lisp Conference. Lowrance, John D. and Harrison, Ian W. and Rodriguez, Andres C. (2000) Structured Argumentation for Analysis. Proceedings of the 12th International Conference on Systems Research, Informatics, and Cybernetics: Focus Symposia on Advances in Computer-Based and Web-Based Collaborative Systems (Baden-Baden, Germany), Aug, pp Session 4 TALES FROM THE FRONTIER 30

31 Lowrance, J.D., Garvey, T.D., and Strat, T.M., (1990) A Framework for Evidential- Reasoning Systems. Uncertain Reasoning, Ed. Shafer and Pearl, Morgan Kaufman Publishers, Inc., pp Lowrance, J.D., (1988) Automated Argument Construction. Journal of Statistical Planning and Inference, vol Session 4 TALES FROM THE FRONTIER 31

Presentation 2 :


42 Paper 3 : Tracing Emerging Irreversibilities in Emerging Technology : The case of nanotubes Rutger van Merkerk and Harro van Lente 5 Abstract `In general, Technology Assessment (TA) approaches seek to monitor and analyse technological and social developments in order to articulate options for intervention. One of the general weaknesses in these endeavours is the difficulty to map and analyse the early phases of technical-scientific fields, where claims of potential benefits abound, but where activities are still more science than technology. The early phases show a great deal of fluidity and open ends, while the routes that emerge may nevertheless lead to significant future rigidities (in terms of technologies, applications and stakeholders). This paper contributes to the development of methods for mapping and understanding the fluidic situation of emerging technologies. Our key concept is the notion of irreversibilities that emerge in the ongoing activities of researchers, institutes, policy makers and firms. Emerging irreversibilities denote the first socio-cognitive patterns that decrease the fluidity and openness, and that, eventually constrain and enable future activities. To trace the emerging irreversibilities we focus on the dynamics of expectations and the agenda building processes. A three-level framework is presented to analyse and visualize the dynamics in three interrelated contexts: the level of the research groups, the technological field and the society. We will take nanotechnology as a subject of study to develop and apply our method. Nowadays, nanotechnology is an important topic for technology firms, policy makers and research institutes. Many countries and firms feel the need to explore and stimulate its promised possibilities. Concomitantly, there is concern that social, economic, environmental and health issues also need consideration. In other words, to cope with these topics, technology assessment of nanotechnology is needed. But it is not straightforward how such an assessment could be achieved, since nanotechnology is in the early stages of development. In addition, nanotechnology is an umbrella term that covers a wide diversity of topics and research areas. Therefore, in order to explore the possibilities of technology assessment, a further focus is needed on a particular area of nanotechnology, in which the first signs of applied science (technology) and (forthcoming) commercial activity are present. We will concentrate on the application of nanotubes in electronic devices (technological field) and in particular on the development of non-volatile memories based on the use of nanotubes in semiconductor technologies. By applying the three-level framework dynamically, it is possible to trace emerging irreversibilities as these shape the ongoing activities in the early stages of development. To conclude, we will discuss how the analysis of early dynamics is a vital ingredient of technology assessment studies that, indirectly (by means of the involved actors), seeks to influence the technological development at stake. By placing the constructive technology assessment (CTA) approach in a 5 Department of Innovation Studies, Utrecht University, The Netherlands, P.O. Box 80125, 3508 TC, Utrecht, The Netherlands r.vanmerkerk@geog.uu.nl and h.vanlente@geog.uu.nl Session 4 TALES FROM THE FRONTIER 42

historical perspective of technology assessment, we will show the relevance of our method for constructive technology assessment studies.

I. INTRODUCTION
Nanotechnology is a rising star in the set of new and emerging technologies. Many countries and firms feel the need to explore and stimulate its possibilities. The future of nanotechnology has become an important topic for technology firms, policy makers and research institutes. Typically, when new technologies emerge, they are accompanied by promises of all sorts. Earlier examples are biotechnology, genomics and microelectronics, or, more generally, ICT. The media tell us, for example, that the new technology will definitely change our lives. It is a so-called generic technology (OECD, 1992) that can be used in all kinds of products and production processes, and thus will have an impact in all areas of economic activity (examples are the materials production industry, the pharmaceutical industry and the electronics industry). Although nanotechnology is still in its exploration phase, industry, governments and research institutes already have high stakes in its future applications. It is estimated that governments and large firms invested over $2 billion in nanotechnology worldwide in 2002 (Arnall, 2003).

Concerns and issues, on the other hand, accompany the emergence of nanotechnology. For instance, it is not certain that these investments will have the intended results. Promising technologies do not automatically deliver what they promise and the outcomes may differ from what was expected. Who will pay the bill for unsuccessfully implemented or unwanted technology? In addition, there is a growing concern about the environmental, health and societal impacts of nanotechnology.

In this paper we will address the issue of how useful assessments can be made, given the enormous, but intrinsic, uncertainties. Our basic claim is that in order to appreciate and to influence developments in new emerging technologies, an understanding of the dynamics is necessary. As nanotechnology is the partly intended, partly unintended outcome of the moves of many actors in industry, research and policy, we need insight into the emerging patterns and mechanisms. We will discuss the phenomena and develop a method for tracing them empirically; important here are the dynamics of expectations and the processes of agenda building. We will develop a three-level framework which enables us to trace the dynamics of expectations and agenda building in detail, and we will employ it in our case study to show that it is possible to trace emerging irreversibilities for a specific application of nanotubes (section IV). We will conclude by placing our contribution in a historical perspective of technology assessment and by discussing the relevance of our method for constructive technology assessment, or CTA. 6

6 Constructive technology assessment studies of nanotechnology are at the moment being performed in the Netherlands. These studies are part of a Dutch research and development programme that coordinates the efforts of leading research institutes and companies in the Netherlands in the area of nanotechnology. The preceding informal network was formed in 2001 and recently, in November 2003, it received substantial funding of 95 million euro from the Dutch government, as well as its official name: NanoNed. An integral part of the NanoNed programme (3% of the budget) is the assessment of social, political, economic and environmental/health issues.
Similar projects and funding can be seen in the other industrialised countries as well. Session 4 TALES FROM THE FRONTIER 43

44 II. THE HOPES AND FEARS OF NANOTECHNOLOGY Nanotechnology is defined as the ability of controlled manipulation at the nanoscale (1-100nm) (1nm is approx. 1/80,000 of the thickness of a human hair) in order to create revolutionary new materials and systems that relate directly to the nanoscale. The ability to control matter at such small length scales got a big push by the development and improvement of a variety of microscopes (e.g., the atomic force microscope, AFM 7 ) in the mid-eighties, which made the visualisation of the atomic region more and more accessible for scientists. One of the first landmarks is the Nobel Prize discovery of a new carbon molecule containing sixty carbon atoms (C 60 ) in the shape of a ball in 1985 (also called a bucky ball) (Kroto, et. al., 1985). Nanotechnology is seen as an enabling technology, which means that it enables the industry to improve their products, but will not likely make products on its own. Nanotechnology can, for example, enable precise targeting of drugs (pharmaceuticals) or make computer screens flexible (electronic industry). In this paper we focus on a special kind of nanosized particle, the carbon nanotube. The term nanotube is generally used to refer to the carbon nanotube, which can be visualised as a sheet of chicken wire, which is rolled up into a cylinder where the loose wire ends seamlessly join (fig. 1). In the remainder of this paper we will use the term nanotube instead of carbon nanotube. Figure 1: rolling up a sheet of carbon atoms (graphene) will result in a single-walled carbon nanotube. The promising developments of nanotubes, and nanotechnology in general, have led, at least according to some analysts, to a nanotechnology hype. Various images about nanotechnology were brought into the world by media, spokespersons, etc., that sketches the seemingly unlimited possibilities that nanotechnology has to offer. Typical examples are very small robots that can conduct operations inside the human body or an elevator into space based on a nanotube cable. While these examples may be farfetched, they feed expectations by various actors in society (e.g., the public, politicians, firms). On the other hand there are growing concerns about the 7 IBM researchers (G. Binnig & H. Rohrer) received the Nobel Prize for their discovery of the scanning tunnelling microscope (STM). This microscope used an ultra fine tip to scan materials atom by atom and is therefore a powerful tool to investigate structures at the nanoscale. This discovery was the beginning of a whole range of microscopes achieving the same precision, but with different methods (e.g., AFM). These later developed microscopes are also capable of manipulating matter atom by atom. Session 4 TALES FROM THE FRONTIER 44

45 development of nanotechnology. NGOs and the media became aware of nanotechnology and addressed their concerns. Here again, we see topics that relate to the very far and speculative future such as nano-systems that control (and reproduce) themselves, and immediate concerns that are based on today s science, such as toxicological effects of nanoparticles. A well-known problem with any attempt to address such concerns is the dilemma pointed out by Collingridge (1980). When a technology is in the early stages of development it is very hard to foresee the social implications of the technology, but the course of development can still be altered easily. When the technology becomes part of our economic and social system, social implications can be easily determined. However, changing and controlling the technological development then becomes increasingly difficult. This is what Collingridge called the dilemma of control. The approach of constructive technology assessment (CTA) addresses the dilemma of control (Schot & Rip, 1997), as it aims at assessing technology development at an early stage, when changes are still viable. CTA studies intend to broaden the development process of the emerging technology by incorporating heterogeneous actors. The approach aims at a reflective process among these actors, and, eventually, at social learning (Smits & Leyten, 1991). Consequently, this might lead to changes in the development at an early stage and supports the social embedding of the technology. III. METHOD: TRACING EMERGING IRREVERSIBILITIES In nanotechnology, the stakes and the expectations are high for various heterogeneous actors. Yet, as in all emerging technologies, the situation is very fluid, unpredictable and no actor has clear knowledge what the technology will bring. Research institutes study a broad variety of scientific subjects and some results will be seen as promising and some not. The promising results and outlooks reshape the expectations for further research and eventually the research agendas (Van Lente, 2000). At the same time more institutes will start to work on the same subject, there is more attention in the journals, conferences on the subject are organised, etc. That is, some routes are becoming less visible and probable, while others will gain more support and strength. As a result, there will be less fluidity in the emerging structures and actors will experience less available choices due to diminishing variety and decisions taken earlier (Callon, 1995). These emerging irreversibilities reduce the complexity of the situation (Rip & Kemp, 1998). The concept of irreversibility indicates processes that cannot be undone once they occur. One-way chemical reactions are a clear example. Likewise, when sociotechnical patterns become irreversible it indicates that the patterns become less fluid over time and that socio-technical structures emerge. In a strict sense, they can by undone, but only with increasing costs and pain. The adjective emerging denotes a likelihood of a phenomenon which is not a true irreversibility yet, to become irreversible in due course. Emerging irreversibilities are the first indicators of a new Session 4 TALES FROM THE FRONTIER 45

46 socio-technical structure, and, therefore, they are worthwhile to explore. At this stage of research, it is not possible to make a complete list of emerging irreversibilities and their indicators. Instead, we will investigate the notion of emerging irreversibilities and develop a method to trace emerging irreversibilities. 8 The growing attention of a certain subject, for instance, can be seen as an indicator for the emerging irreversibility. Figure 2 shows the growing attention in journals and indicates that the term nanotubes was increasingly used in the titles of scientific articles (extracted from the PiCarta database) Figure 2: number of times Nanotube is mentioned in article titles (PiCarta). In 1999 a new specialised journal, the Journal of Nanoparticle Research, is established. This indicates the crystallisation of a new scientific community. The new outlet for publication on a new topic, and the early definition of a new audience indicate a next step in an emerging structure. While this step is reversible, in principle, it will be hard to undo in practice, because it has changed the perception and routines of researchers and has shaped expectations of a new audience. 9 A second example of emerging irreversibilities are roadmaps, which can be seen as articulated expectations about which path a company or an industry (as in the case of chip manufacturing) should follow for a certain period, e.g. 10 years. The fact 8 A phenomenon, which is not explicitly dealt with in this paper, is strategic behaviour of the various actors involved in the technological development. However, we do subscribe the importance of strategic behaviour in the daily lives of the actors. This paper deals with the outcomes of this behaviour, but not with the behaviour itself. 9 The fact that we use the emergence of this specialised journal for this paper is the fact that nanotubes is one of the major topics in this journal. What Roco (1999:1) states as: Research contributions on nanoparticles, clusters, nanotubes, nanocrystals, nanolayers, and macromolecules surrounded either by gases, liquids or solids, are brought together in this single publication. Session 4 TALES FROM THE FRONTIER 46

47 that these roadmaps are made is an indication that actors involved in this process link up to reach a common goal. The path - as is written down in the roadmap - is the expression of the shared expectation that this is the right way to go. The roadmap, thus, functions as a device to keep the actors together. To deviate from it can only be done with increasing costs, and this is exactly what indicates an emerging irreversibility. The question, then, is how to trace emerging irreversibilities? We will focus on two ingredients of technological change that are especially important in the early stages: (i) the expectations that guide the search activities of scientists and firms, and (ii) the processes of agenda building (Van Lente, 2000; Van Lente and Rip, 1998). Expectations have important roles in technological development. Since all involved actors - scientists, firms, policy makers - have to act under the condition of insufficient information, they will depend on the shared expectations that are present. Expectations shape the mindsets of the various actors, while, in their turn, expectations will be shaped and reshaped by research results, findings in other technical fields, or external forces. In general, expectations mould variation processes and guide selection (Van Lente, 1993). Likewise, in processes of agenda building variation is further reduced as certain topics are selected as important and urgent. Expectations are translated into the agendas of the different actors, upon which they act. The agendas give rise to activities and different outcomes (e.g., scientific results, a collaboration between companies), which will evoke new, often more specific expectations and agenda building. 10 We distinguish various levels where variation and selection occur: (i) locally, within a firm or research group, (ii) more general, within a technical-scientific field and (iii) more global and diffuse, in society at large (see Figure 3). The vertical dimension shows the three levels of aggregation. The first level deals with research groups. Here, research is done at very specific subjects (e.g., the electromechanical properties of nanotubes). The second level refers to a technological field (e.g., nanotubes in electronic devices). The third level relates to the societal level (e.g., nanotubes as part of nanotechnology), where governments, NGO s and other societal actors articulate the social, political and economic aspects of the new technological field. The developments in nanotechnology also reflect on the developments on nanotubes and vice versa. The horizontal dimension distinguishes between different core areas of technological activity: basic research and research for market applications. Note that the levels are dynamic and interrelated. Each level will have its own timescales, though: changes at societal level have a slower pace than at the level of individual research groups (Rip and Kemp, 1998). Figure 3 shows what questions should be answered in order to address the particular cell in the matrix and Figure 4 gives typical sources to address the questions. 10 Earlier one of the authors has analysed this ongoing dynamic as the promise-requirement-cycle. (Van Lente, 2000). Alternatively, one could focus on emerging networks of actors and artefacts: the preferred entrance point of actor network theory (Callon 1995, Latour, 1987). 
Then, the analyst would trace the emerging concentrations in actor-networks (e.g., firm cooperation, joint research efforts), as indicators of emerging irreversibilities. Session 4 TALES FROM THE FRONTIER 47

Figure 3: questions that are raised in order to address the dynamics of expectations and processes of agenda building.

Basic research:
- Society (nanotubes as part of nanotechnology): How are the technological developments of nanotubes from the scientific viewpoint taken up by society?
- Technological field (nanotubes in electronic devices): What are, from a scientific point of view, the different options and focus points for nanotubes in electronic devices?
- (Research) group (nanotubes used in non-volatile memories): What are the results from (academic) research groups that contribute to the realization of non-volatile memories based on nanotubes?

Market:
- Society (nanotubes as part of nanotechnology): How are the technological developments of nanotubes from the market viewpoint taken up by society?
- Technological field (nanotubes in electronic devices): What are, from a market point of view, the different options and focus points for nanotubes in electronic devices?
- (Research) group (nanotubes used in non-volatile memories): What are the results of private companies that contribute to the realization of non-volatile memories based on nanotubes?

Figure 4: possible sources to answer the questions raised in Figure 3.

Basic research:
- Society: reports by NGOs; reports by government agencies; spokesperson statements
- Technological field: review articles that give an overview of the developments in the field
- (Research) group: articles in scientific journals

Market:
- Society: reports by NGOs; reports by government agencies; spokesperson statements
- Technological field: reports that translate technological developments into market potentials; articles addressing the market potentials of technological developments
- (Research) group: press releases of individual firms; articles that address the developments and potentials of applications

The case we discuss in the next section deals with non-volatile memories based on nanotubes. We chose this application of nanotubes in electronic devices because it is, as will become apparent, a dynamic case among the (still) very few applications of nanotubes in electronic devices, and one which also shows some commercial activity.
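One way to work with Figures 3 and 4 in practice is to keep the framework as a small, fillable data structure in which each cell collects dated evidence items; where items accumulate, fluidity is, on this naive reading, decreasing. The sketch below is our own organisational illustration rather than part of the method itself; the cell labels mirror the figures, and the example entries paraphrase developments discussed later in this paper.

```python
# Organisational sketch of the three-level framework (Figures 3 and 4), applied
# dynamically: each cell pairs its guiding question and typical sources with a
# list of dated evidence items. Accumulating items in a cell are read, naively,
# as growing attention -- one possible sign of an emerging irreversibility.
# Illustration only; the paper's actual analysis is qualitative.

from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Cell:
    question: str                                        # guiding question (Figure 3)
    sources: List[str]                                   # typical sources (Figure 4)
    findings: List[Tuple[int, str]] = field(default_factory=list)  # (year, note)

framework = {
    ("research group", "basic research"): Cell(
        "Which academic results contribute to nanotube-based non-volatile memories?",
        ["articles in scientific journals"]),
    ("research group", "market"): Cell(
        "Which company results contribute to nanotube-based non-volatile memories?",
        ["press releases of individual firms", "application-oriented articles"]),
    ("society", "basic research"): Cell(
        "How are nanotube developments, seen from science, taken up by society?",
        ["NGO reports", "government agency reports", "spokesperson statements"]),
    # ... the remaining cells of Figures 3 and 4 follow the same pattern.
}

framework[("research group", "basic research")].findings += [
    (2000, "proof of principle of the suspended SWNT crossbar"),
    (2001, "flow-guided deposition of straight nanotubes"),
    (2002, "electric-field-guided growth of straight nanotubes"),
]
framework[("research group", "market")].findings += [
    (2000, "Nantero founded"),
    (2003, "10Gb prototype array; compatibility with existing lithography equipment"),
]

# A crude indicator: cells where evidence piles up are candidate sites of
# decreasing fluidity, to be examined qualitatively for emerging irreversibilities.
for (level, area), cell in framework.items():
    print(f"{level} / {area}: {len(cell.findings)} evidence item(s)")
```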

IV. CASE: NONVOLATILE MEMORIES BASED ON NANOTUBES
A promising application of nanotubes is to use them as electromechanical 11 components in non-volatile memories 12. Non-volatile means that the data remains when the power of the electronic device is turned off. For example, for personal computers, you can continue your work where you left it the previous day (like turning on the television), because the information is still present in the memory (this is called instant booting). In 2000, Rueckes et. al. (Charles Lieber's group, Harvard University, Cambridge, Massachusetts) published the architecture (fig. 5) of how to make these nonvolatile memories based on the suspended SWNT crossbar (proof of principle). In the off state (fig. 6), the nanotubes have a certain distance between them. The lower nanotube is semiconducting, the upper nanotube is metallic 13. The metallic nanotube will bend towards the perpendicular semiconducting nanotube when both are electrically charged (an electromechanical process). The nanotubes will then stay in this position due to the Van der Waals forces 14. These forces cause the nanotubes to retain their position, even when the power is turned off, giving the memory its nonvolatile character. The positions can be determined by measuring the resistance (directly related to the flow of electrons) between the nanotubes. In the on state the resistance is much lower, which allows a zero to be distinguished from a one. By making a large array of these crossbars (every crossbar represents a bit) it is possible to make a memory chip.

11 Electromechanical means that an electrical current can induce mechanical movement.
12 Memory, which is commonly referred to as RAM (Random Access Memory), is a temporary (volatile) storage area utilized by the processing unit of every personal computer. Before a program can be used, the program is loaded from the hard drive into the memory, which allows the processing unit (processor) to directly access the program. The reason for this process is that hard drives are too slow to run programs from directly, and therefore the program is temporarily (as long as you use it) loaded into fast memory.
13 Different materials have a different resistance towards the conduction of electricity. Most metals (e.g., iron or copper) conduct electricity very well and are therefore used for wiring to transport electricity from one place to another. Insulating materials (e.g., most plastics) conduct electricity very poorly and are therefore used to shield wires from the environment. Semiconducting materials conduct less well than metals, but better than insulators. Transistors (the basic building block of computer chips) are made from semiconducting material. Nanotubes can have both properties, depending on the geometry of the single-walled carbon nanotube (Odom et. al., 1998).
14 The Van der Waals forces are the physical forces of attraction and repulsion existing between molecules, which are responsible for the cohesion of molecular crystals and liquids. Van der Waals forces act only over relatively short distances and the forces are important in the mechanics of adhesion.
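The bit-storage logic just described can be made concrete with a toy model. The resistance values, read threshold and array size below are arbitrary illustrative numbers, not figures from Rueckes et. al. (2000); the point is only that each crossbar junction stores one bit that is written electromechanically and read back as a low or high resistance, and that the written state persists without power.

```python
# Toy model of a nanotube crossbar memory (illustrative numbers only).
# Each junction stores one bit: "on" means the tubes are in contact (low
# resistance, held in place by Van der Waals forces, hence non-volatile);
# "off" means the tubes are separated (high resistance).

import numpy as np

R_ON, R_OFF = 1e4, 1e9     # ohms; arbitrary values chosen for illustration
READ_THRESHOLD = 1e6       # resistances below this are read as a logical 1

class CrossbarMemory:
    def __init__(self, rows: int, cols: int) -> None:
        self.in_contact = np.zeros((rows, cols), dtype=bool)   # all junctions "off"

    def write(self, row: int, col: int, bit: int) -> None:
        # Charging the two crossing tubes bends the metallic tube onto the
        # semiconducting one (bit = 1) or releases it (bit = 0).
        self.in_contact[row, col] = bool(bit)

    def read(self, row: int, col: int) -> int:
        resistance = R_ON if self.in_contact[row, col] else R_OFF
        return int(resistance < READ_THRESHOLD)

mem = CrossbarMemory(4, 4)      # a 4 x 4 array: 16 bits, one per crossbar
mem.write(2, 3, 1)
print(mem.read(2, 3), mem.read(0, 0))   # -> 1 0
# Because the "on" state is held mechanically rather than electrically, the
# array keeps its contents when the power is switched off.
```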

Figure 5: architecture of suspended nanotube memory. Source: Rueckes et. al., 2000.
Figure 6: on and off states of a suspended SWNT crossbar. Source: Rueckes et. al., 2000.

The architecture explained above is the ideal solution. However, there are limitations in realizing this architecture (explained below) and therefore a hybrid solution has been developed and patented by Nantero (US Patent No ). The intention is to commercialise this technology as soon as possible. In the hybrid solution the lower nanotube is replaced by a semiconducting structure created by common lithography techniques 15. Then a layer of nanotubes is deposited and the unwanted nanotubes are etched away (again with common lithography).

G. Schmergel, T. Rueckes and B.M. Segal founded Nantero in 2000 (Rueckes being one of the inventors of the proof of principle). Nantero is developing NRAM(TM), a high-density non-volatile random access memory chip using nanotube technology. The company expects to deliver a product that will replace existing forms of memory, such as DRAM, SRAM and flash memory, with a high-density non-volatile universal memory (Nantero, 2003b). This type of memory can be used in a wide variety of electronic devices (PCs, digital cameras, MP3 players, etc.). With this strategy the company has been successful in getting several Venture Capital grants over its first few years of existence (Nantero, 2001 and Nantero, 2003a). The company plays an important role in the development of non-volatile memories based on nanotubes.

15 Lithography is a common method used in the computer chip manufacturing industry to produce desired structures in materials.

51 IV 1 Tracing dynamics of expectations Society The scientific developments, and understanding of nanotubes production and characteristics 16 have led to expectations on the level of the society. A spokesperson in favour of nanotube developments is Richard Smalley (Rice University, Houston, Texas). Considering the following statements from Smalley (Ball, 2001:1): Nanotubes will be cheap, environmentally friendly, and do wonders for humankind. With this statement Smalley stipulates (from a scientific point of view) a very bright picture for nanotubes. However, there are other voices that agitate against this picture. Arnall (2003:7) for example states that: Carbon nanotubes are already found in cars and some tennis rackets, but there is virtually no environmental or toxicological data on them. As well as the ETC group (2003a:72) who propose that: governments should declare an immediate moratorium on commercial production of new nanomaterials (editorial: which includes nanotubes) and launch a transparent global process for evaluating the socio-economic, health and environmental implications of the technology. Arnall arrives at the same conclusion as the ETC group, which is applying the precautionary principle. From the market side the expectations focus on the possibilities that nanotubes might have to improve or revolutionise existing products. Already nanotubes are used to strengthen materials (e.g., tyres or tennis rackets) and production facilities are set up to deliver the demand for nanotubes (MWNT and SWNT) that is expected for the coming years. Arnall (2003:14) states here (taking a market perspective): the most important material in nanotechnology today. Such statements give rise to believe that nanotubes have much to offer in terms of applications. This is indeed the case when we look at how broad the application areas for nanotubes are generally addressed: pharmaceuticals, electronic devices, material production, energy technologies, etc. Concluding, the expectations on the societal level show a contradiction in the sense that on the one hand nanotubes are used without regulation and on the other hand there is a public call (from various groups) that regulation is needed. However, the fact that nanotubes offer great promises for various industries is acknowledged. Technological field After the discovery of the single-walled nanotube in 1993, possible applications of nanotubes for electronic devices came out of the scientific community (Ball, 2001). Using the straight tubes as wires in chips was one of the first options. In 1998, Cees Dekker s group at the Delft University of Technology (Netherlands) turned a nanotube into a transistor (the basic building block of computer chips). This made it 16 The research agenda on nanotubes has, in the last decade, changed considerably (Ball, 2001). In the early 90 s, especially the growth and (electrical, chemical and mechanical) properties were investigated in great detail. This research agenda shifted over the years towards the production capacity, controlled growth and applications of nanotubes. This also implicates that the variety of research topics has broadened. A broader spectrum is addressed, pharmaceuticals, new and enhanced materials, solar energy, etc. Also research is done from basic research (e.g., production capacity of single-walled carbon nanotubes) to applications and the production of the applications. Session 4 TALES FROM THE FRONTIER 51

theoretically possible to build processors (the central computational unit of personal computers) out of nanotubes. However, the expectations are that commercialising this option still lies far ahead (e.g., 10 years).

Nanotubes can also be used to emit electrons. This opens up the possibility to use them as so-called field emitters to produce flat (even flexible) displays. The electrons emitted by the nanotube are pointed at a layer of phosphor, which as a consequence lights up. By making an array of pixels a screen can be obtained. In 1999, Jong-min Kim and his colleagues at the Samsung Advanced Institute of Technology in Suwon (Korea) did just that. The same technique can be used to produce vacuum-tube lamps in different colours that are twice as bright as conventional lightbulbs, longer lived and at least 10 times more energy efficient (Collins & Avouris, 2000).

Since the publication of Rueckes et. al. (2000), in which they introduce the architecture of a non-volatile memory based on nanotubes, it has been clear that building these memories is one of the possible applications of nanotubes in electronic devices. We have already discussed how this technology works, as it is the subject of this case. Cientifica 17 (2003), writing from a market perspective, points out the following when the opportunities for nanotubes in electronic devices are discussed: Big markets, apart from materials, in which nanotubes may make an impact, include flat panel displays (near-term commercialization is promised here), lighting, fuel cells and electronics. This last is one of the most talked-about areas but one of the farthest from commercialization, with one exception, this being the promise of huge computer memories (more than a thousand times greater in capacity than what you probably have in your machine now) that could, in theory, put a lot of the $40 billion magnetic disk industry out of business. It is clear that two applications are highlighted for expected short-term commercial use: flat panel displays and nonvolatile memories.

Important to mention from a market perspective (computer chip industry) is the following, as articulated by Collins & Avouris (2000): Within this decade, the materials and processes on which the computer revolution has been built will begin to hit fundamental physical limits. Still, there are huge economic incentives to shrink devices further, because the speed, density and efficiency of microelectronic devices all rise rapidly as the minimum feature size decreases. These developments call for new techniques to continue the ongoing miniaturisation in the computer chip industry. Nanotechnology should give the answers here.

Research group
The expectations of using nanotubes for nonvolatile memories started with the Nature publication of Charles Lieber's group (Rueckes et. al., 2000). In this article they presented a proof of concept of the suspended SWNT crossbar and the architecture of the possible application. The authors address some problems that must be solved in order to actually make the nonvolatile memories. However, they do state that: The

17 Cientifica is the business information and consulting arm of CMP Cientifica, providing global nanotechnology business intelligence and consulting services to industry and investors worldwide.

developments in these growth and assembly areas suggest that highly integrated SWNT device arrays, which represent the next step in our plans for molecular electronics, may be soon realized. What is meant by soon, however, is not specified.

In 2002, James Heath's group at the University of California (Los Angeles) reported that guiding the growth with an electric field could solve the problem of growing straight nanotubes (Diehl et. al., 2002). This scientific result solved the problem of growing straight nanotubes. Deposition of nanotubes into a parallel array (as is needed to create the hybrid solution) can be done in multiple ways. One can individually manipulate the nanotubes into the right position; however, due to the huge number of nanotubes that need positioning, this is not an option. The second option is to use an electric field to grow the (straight) nanotubes onto the substrate (Diehl et. al., 2002, as discussed above). A third way is to use a flow to guide the previously made straight nanotubes into position. Charles Lieber's group reported this method in 2001 (Huang et. al., 2001). These scientific results solved the problem of depositing the nanotubes onto a substrate.

The scientific results, as mentioned above, reinforced the expectations that non-volatile memories could be produced. This can be shown by a statement by Ball (2002) in an article in which he discusses these results: This proof of principle raises hopes that a nanotube lattice could form computer memory, storing one bit of information at each junction. He refers here to the original article (Rueckes et. al., 2000); however, these expectations are expressed two years later, after the new scientific results that were obtained in the meantime.

On the market side, other dynamics are present. Here, Nantero, being the only company working on this technique, tries to mature the given technique (proof of principle) into a usable method for producing nonvolatile memories based on nanotubes. Nantero was founded in 2000 and received its first Venture Capital grant in 2001 (Nantero, 2001). The fact that Nantero received this grant shows that the investors, based on their expectations, have confidence in Nantero's success. In May 2003 a 10Gb prototype is ready, produced by standard semiconductor processes (Nantero, 2003c). In September 2003 Nantero receives the second Venture Capital grant (Nantero, 2003a). In the same month Nantero shows compatibility with lithography equipment from ASML (Nantero, 2003b). In February 2004 Nantero states they are on track for NRAM development. Here we see that Nantero over the years has built on the expectations that nonvolatile memories will be commercialised soon. These expectations were formulated in the following way. In 2000 Rueckes et. al. state: plans for molecular electronics, may be soon realized. In 2001 Nantero states: The company expects to deliver a product that will replace all existing forms of memory. In May 2003 Nantero states: Creating this enormous array of suspended nanotubes using standard semiconductor processes brings us much closer to our end goal of mass producing NRAM chips. In September 2003 Nantero states: Universal memory has been a dream for the semiconductor industry for decades; we feel that Nantero's innovative approach using carbon nanotubes and a nanoelectromechanical design can make that dream a reality in the near term. In 2004 Nantero states: The proprietary manufacturing

approach will enable for the first time the ultra-large scale integration (ULSI) of carbon nanotube-based devices in a deep sub-micron semiconductor fabrication line. In the near future, these innovations will allow NRAM(TM) to be one of the first mass manufactured nanotechnology products. Within these expectations we see a shift from discovery (2000), via a prototype (May 2003), to a proprietary manufacturing approach (2004). Hence, the developments at Nantero show a clear way towards commercialisation. During these few years Nantero received rather extensive media attention (37 articles in total) from technology as well as business journals. This is a clear sign that the media see Nantero as a promising company to take nanotechnology to the market. Also the fact that Nantero received the Venture Capital grants, and especially the second round in 2003, is a sign that the investors expect Nantero to succeed, because venture capitalists perform extensive research before investing.

Concluding, different developments in basic research have provided the building blocks that can be used to develop nonvolatile memories based on nanotubes. Nantero has taken up this challenge since 2000. Subsequent results in basic research as well as from Nantero have reinforced the expectations. These promising results have led to the second round of VC financing for Nantero as well.

IV 2 Tracing agenda building processes
Society
Are the expectations (and concerns) about the toxicity of nanoparticles (incl. nanotubes) taken up by policy makers and translated into programmes or regulation that respond to these concerns? Some initiatives have started over the last few years; we will mention the three most striking ones. First, the Royal Society and the Royal Academy of Engineering in the UK have incorporated these issues into their study of nanoscience and nanotechnology, commissioned by the UK government. The goal is to carry out an independent study of likely developments and of whether nanotechnology raises or is likely to raise new ethical, health and safety or social issues which are not covered by current regulation. Second, at the level of the EU, a project within the 6th Framework Programme has been approved. The NanoSafe project assesses the risks involved in the production, handling and use of nanoparticles in industrial processes and products, as well as in consumer products. The results are expected to indicate risks to workers and consumers, and to recommend regulatory measures and codes of practice. Third, the ETC group is working to develop an International Convention for the Evaluation of New Technologies (ICENT), which it hopes to bring before a United Nations agency. This should create a new mechanism that will make it possible for the international community to monitor the development of new technologies whose introduction could affect (positively and/or negatively) human health, the environment, or society's well-being (ETC group, 2004).

55 Technological field None of the expected possible applications have come to a successful commercialisation yet. Therefore, basic research as well as efforts from the market side is focussed on realising the applications. The difference here is that basic research generally is conducted for all options and possible (not foreseen at this time) other applications, while the market side focuses (in general) on applications that are close to commercialisation. There are different topics that need (or needed) solving in order to be able to realise the expected applications. It is possible that a solution for a particular problem for one application can also be a solution for another application. An example is the problem that when nanotubes are grown, it is until now impossible to determine the electronic character (metallic or semiconductor) beforehand. Therefore, after growing, you end up with a mixture of metallic and semiconductor nanotubes. This is a problem, because often you need specific characteristics of the nanotubes in order to get a working application. To specify this example further, Cees Dekker s group at Delft University, showed in 1998 (Tans et. al., 1998) that a single semiconductor nanotube could be turned into a transistor. In order to make, for example, a modern processor for personal computers, you need to have in the order of 100 million transistors. Without the ability to grow nanotubes with the right characteristics beforehand, a processor based on nanotube transistors is impossible to produce. Research group Restrictive factors in the development of technologies are repeating phenomena that end up on the agenda of research groups. Scientists observe hurdles for further development of a promising application (guided by the expectations) and start to work on solving the problems at hand. This process can also be observed in the development of non-volatile memories. Typical problems addressed here were or are still the growth of straight nanotubes, precise deposition of the nanotubes on the substrate, and the separation of metallic and semiconductor nanotubes. Because not all problems were solved over the last years, Nantero adapted a (proprietary) hybrid solution that allows for some errors, and metallic and semiconductor nanotubes do not need to be separated. So, over the last few years some problems were solved and others were overcome by adapting the design. But, in the same period also the agenda changed from working on detailed technical problems (aiming at a prototype) towards scaling and production (making the technology ready for commercialisation). The last part was done in collaboration with ASML, which led to the fact that the technology is compatible with the existing lithography equipment (Nantero, 2003b). For the coming years Nantero not only aims at getting their product to the market, but also improving the existing technology to achieve even higher densities of suspended crossbars, which leads to larger memories. However, one of the problems cannot be influenced by Nantero, as stated in May 2003 (Nantero, 2003c): This process was used to make a 10Gb array now, but could easily be used to make even larger arrays the main variable now controlling the size is the resolution of the Session 4 TALES FROM THE FRONTIER 55

56 lithography equipment. At the same time, basic research groups work on fundamental insights in, for example, controlled growth of metallic or semiconductor arrays of nanotubes. Future scientific advances might improve the architecture of non-volatile memories and eventually to the realisation of the ideal solution (fig. 5 & 6). The same advances might also lead to more advanced architectures for other type of computer chips (e.g., processors). IV 3 Tracing emerging irreversibilities Based on the evidence on the dynamics of expectations and agenda building processes as presented in the previous two paragraphs, we will now present main findings for each of the different cells in the matrix in Figure 7. These findings also give answers to the questions as proposed in Figure 3. Society Technological field (Research) group Basic research Nanotubes as part of nanotechnology Next to the acknowledgement that nanotubes offer huge possibilities, there is an open discussion on the possible toxic effects of nanoparticles (incl. nanotubes) on the environment and inside the human body. Some organizations ventilate their concerns on this topic and a few research programs have been initiated to address these issues. Nanotubes in electronic devices The academic community addresses a wide variety of electronic devices based on nanotubes. These options are based on advances in the understanding of and the control to determine (beforehand) the characteristics of nanotubes. However, existing hurdles also restrain further developments. The timescales on which these applications might become viable differs a lot. Nanotubes used in nonvolatile memories Step by step the problems around producing predetermined nanotubes and applying them for nonvolatile memories are solved (straight growth and deposition). Nevertheless, still some hurdles have to be taken to make the ideal solution (fig. 5) possible. Market Nanotubes as part of nanotechnology Apart from the concerns on the possible toxicity, industry started to produce nanoparticles with a strong growing increase in capacity. The market then focuses on the possibilities nanotube applications promise to improve or revolutionise existing products. Nanotubes in electronic devices The market focuses on a selection of promising electronic applications based on nanotubes. The most promising are flat panel displays and nonvolatile memories. The fact that current semiconductor technology will reach the physical limits soon, gives a push on the market to come up with new solutions to continue the ongoing miniaturisation in the computer chip industry. Nanotubes used in nonvolatile memories Nantero tries to mature the technique (proof of concept) into a usable method for producing nonvolatile memories based on nanotubes. Over the years two rounds of Venture Capital were received and successful collaboration with ASML was established. In the coming years Nantero aims at getting their product to the market and to improve the existing technology. Figure 7: main finding located within the three-level framework. These insights and empirics give the opportunity to trace emerging irreversibilities that arose around nanotubes and more specifically nanotubes in electronic devices, and non-volatile memories based on nanotubes. We have shown that results of research groups directly gives rise to expectations for promising applications and changes the agendas for the future. 
The accumulation of research results (e.g., straight growth and precise deposition of nanotubes) removes the hurdles that previously prevented promising applications from becoming reality. In the specific application discussed in this paper, this led (on the market side) to the realisation of Nantero's prototype of a non-volatile memory. Later on, Nantero demonstrated compatibility with existing lithography equipment as a next step towards a producible technology. We note that the scientific research results (straight growth, deposition), the prototype and the proof of compatibility are emerging irreversibilities. These developments showed the academic and business communities that the technology (or even nanotechnology) is actually capable of producing workable products for the electronics industry.

The founding of Nantero, the allocation of two rounds of venture capital, and the collaboration with ASML indicate the emerging irreversibility that Nantero has become a central player in the realisation of highly integrated non-volatile memories. This changed the market side in the sense that an extra player emerged within the electronics industry. The scientific community (related to the application of nanotubes in electronic devices) changed in the sense that since 1993 more and more attention has been drawn to nanotubes. This led to the recognition of a specific set of promising applications (the same process happens on the market side, although a more limited set of options is recognised). The fact that such selections can be made indicates that a shared agenda exists about what is useful to work on. This leads to the emerging irreversibility that more groups work on subjects directly related to the realisation of these promising options. At the level of society we observed open discussions on different topics. This indicates growing attention for various aspects of nanotubes as part of nanotechnology. It has not led to emerging irreversibilities yet, but it can be marked as an indication of emerging irreversibilities. Whether this will occur in future depends on the outcome of the ongoing debates, probably grounded in results from the lower levels of the three-level framework.

Figure 8: Emerging irreversibilities located within the three-level framework. In summary: at the level of the technological field (basic research side, nanotubes in electronic devices), more research groups work on similar problems related to nanotube applications; at the level of the research group (basic research side, nanotubes used in non-volatile memories), the research results are the straight growth of nanotubes and the controlled deposition of the nanotubes on a substrate; at the level of the research group (market side), the irreversibilities are the founding of Nantero and two rounds of venture capital, the prototype of a 10Gb array, the successful collaboration with ASML, and the proof of compatibility with existing lithography equipment. The remaining cells, including those at the level of society (nanotubes as part of nanotechnology), contain no entries yet.

As mentioned in section 4, the levels in the three-level framework are interrelated. A few examples from the case where this is visible will now be highlighted. First, basic scientific results at the level of the research groups can influence the level of the technological field, because the results shape the expectations about the most promising applications. Second, sentiments at the societal level might influence the possibilities for the electronics industry to develop technologies that might receive negative publicity. This effect can also be reversed: extra incentives are present when a certain technology is received positively at the societal level. These are just two examples.

Nevertheless, by analysing a case with the three-level framework it is possible to reveal these dynamics.

V. CONCLUSIONS AND DISCUSSION

In this paper we proposed a route for dealing with the intrinsic uncertainties of a new emerging field like nanotechnology. The hopes, expectations and also the increasing social concerns raise questions about the possibilities of assessing the ongoing developments. While Collingridge's dilemma of control definitely applies in this case, the concept of emerging irreversibilities helps to locate the first signs of the new socio-cognitive structures that will constrain and enable future developments. The three-level framework makes it possible to gather findings on these first indications. Our attempt fits nicely into the historical trend of technology assessment methods to incorporate and exploit the actual technology dynamics at stake. A brief historical digression is helpful at this point. Technology assessment (TA) started in the early 1970s as an early warning method (Smits & Leyten, 1991), mainly meant to inform parliaments about possible negative effects of new technologies. In the early 1980s it became clear that, due to the rather unpredictable nature of the future of technology, this goal was too optimistic. Therefore, during the 1980s, TA developed into a policy instrument, used as a continuous process to support policy-making. In the early 1990s the concept of constructive technology assessment (CTA) was suggested (Schot and Rip, 1997). The basic idea was to include, as much as possible, the relevant social actors (e.g., scientists, policy-makers, firms) that will influence and be influenced by the new technology. Thus, through the co-evolution of new technology and new embedding networks, CTA strives to play an active role in the development of technology. As nanotechnology is still in the early phases of development, co-construction by all possibly relevant actors is not straightforward. Therefore, we suggested in this paper that a focus on expectations and agenda building is helpful, as these are phenomena that can be observed in situations that show a great deal of fluidity and open ends. The three-level framework allows the analyst to study different perspectives on a specific case and at the same time retain an overview of the situation. By applying the framework dynamically it was possible to identify emerging irreversibilities that directly relate to the case. It can therefore be concluded that by applying the method as developed in section 3, insight into the fluid situation and the dynamics of emerging technologies can be gained. Why is the tracing of emerging irreversibilities important in the light of constructive technology assessment? In general, CTA studies aim at assessing technological development in an active way in order to maximise the social embedding of the new technology. Our basic claim is that in order to appreciate and to influence developments in new emerging technologies, an understanding of the early dynamics is necessary. As reasoned above, applying the proposed method helps to gain this understanding. The same method will thus be useful as input for CTA studies in which the perspectives and actions of multiple heterogeneous actors are involved.

Understanding the dynamics from the different perspectives gives insight into the different points of view of the actors involved in the CTA study.[18]

[18] For example, one of the components of a CTA study (at least in some approaches) is the formulation of socio-technical scenarios, which supports the actors in formulating their views on the future. These views are directly related to the social perspectives on the new technology. As the set of involved actors is heterogeneous, the developed scenarios will differ in outlook and consequences. The proposed method will be useful for locating the various socio-technical scenarios and viewing them in light of the dynamics at the separate levels. In addition, the results and insights gained by applying CTA tools in practice can be fed back into theories of technology dynamics.

Finally, we note that the emerging character of nanotechnology provides research opportunities for innovation and technology studies. The prevailing type of study in journals and books on technology dynamics is the retrospective analysis. The drawbacks of a retrospective approach are well known: such studies tend to emphasise the dominant route that emerged as the winner of the variation and selection process and thus to ignore the deeply fluid character of new emerging technologies in their first stages (Latour, 1987). Studying nanotechnology while it is unfolding at this very moment gives the opportunity to observe (for example with the method proposed in this paper) the construction of the technology in a more symmetrical way. To conclude, the method proposed in this paper proved useful for organising the data and structuring it into a credible story. By applying the method, insights are gained about the dynamics within the three levels and about how the levels interact. These insights are valuable for understanding the dynamics of a particular technology and help to trace emerging irreversibilities in the early phases of technological development.

Acknowledgements
The authors of this paper would like to thank Arie Rip and Ruud Smits for their valuable comments.

VI. REFERENCES

Arnall, A.H., 2003, Future technologies, today's choices: Nanotechnology, artificial intelligence and robotics; A technical, political and institutional map of emerging technologies. London, Greenpeace Environmental Trust, July.
Ball, P., 2001, Roll up for the revolution. Nature, Vol. 414, November.
Callon, M., 1995, Technological conception and adoption network: Lessons for the CTA practitioner. In: Rip, A., T.J. Misa & J. Schot, Managing technology in society. London, Pinter.
Cientifica, 2003, The nanotechnology opportunity report. June, 2nd edition, Executive summary.
Collingridge, D., 1980, The social control of technology. London, Pinter.
Collins, P.G. & P. Avouris, 2000, Nanotubes for electronics. Scientific American, December, Vol. 283, Issue 6.
Diehl, M.R., S.N. Yaliraki, R.A. Beckman, M. Barahona & J.R. Heath, 2002, Self-assembled, deterministic carbon nanotube wiring networks. Angewandte Chemie International Edition, Vol. 41, No. 2.

ETC group, 2003a, The big down. Atomtech: Technologies converging at the nano-scale. January.
ETC group, 2003b, Size Matters! News Release, April 14, 2003.
ETC group, 2004, Playing god in the Galapagos. News Release, March 11, 2004.
Huang, Y., X. Duan, Q. Wei & C.M. Lieber, 2001, Directed assembly of one-dimensional nanostructures into functional networks. Science, Vol. 291.
Iijima, S., 1991, Helical microtubules of graphitic carbon. Nature, Vol. 354.
Kroto, H.W., J.R. Heath, S.C. O'Brien, R.F. Curl & R.E. Smalley, 1985, C60: Buckminsterfullerene. Nature, Vol. 318.
Latour, B., 1987, Science in action. Milton Keynes, Open University Press.
Nantero, 2001, Nantero, Inc. announces $6MM in funding; aims to rapidly develop nanotube-based universal memory. Nantero Press Release, October.
Nantero, 2003a, Nantero, Inc. announces $10.5MM in funding; developing nanotube-based non-volatile RAM technology for licensing. Nantero Press Release, September.
Nantero, 2003b, Nantero, Inc. announces collaboration with ASML; compatibility of nanotube processes with ASML equipment proven. Nantero Press Release, September.
Nantero, 2003c, Nantero, Inc. creates an array of ten billion nanotube bits on a single wafer; standard semiconductor processes used. Nantero Press Release, May.
Nantero, 2004, Nantero's Dr. Thomas Rueckes garners awards and acknowledges company on track for NRAM development. Nantero Press Release, February.
Nemmar, A., P.H.M. Hoet, B. Vanquickenborne, D. Dinsdale, M. Thomeer, M.F. Hoylaerts, H. Vanbilloen, L. Mortelmans & B. Nemery, 2002, Passage of inhaled particles into the blood circulation in humans. Circulation, Vol. 105.
Odom, T.W., J.-L. Huang, P. Kim & C.M. Lieber, 1998, Atomic structure and electronic properties of single-walled nanotubes. Nature, Vol. 391.
OECD, 1992, Technology and the economy: The key relationships. Paris, The Technology/Economy Programme.
Rip, A. & R. Kemp, 1998, Technological change. In: S. Rayner & E.L. Malone, Human choice and climate change. Columbus, Battelle Press, Vol. 2.
Roco, M.C., 1999, Nanoparticles and nanotechnology research. Journal of Nanoparticle Research, No. 1.
Rueckes, T., K. Kim, E. Joselevich, G.Y. Tseng, C.-L. Cheung & C.M. Lieber, 2000, Carbon nanotube-based nonvolatile random access memory for molecular computing. Science, Vol. 289.
Schot, J.W. & A. Rip, 1997, The past and future of constructive technology assessment. Technological Forecasting and Social Change, Vol. 54.
Smits, R., 2002, The new role of strategic intelligence. In: Tübke, A., K. Ducatel, J. Gavigan & P. Moncada-Paternò-Castello (Eds.), Strategic policy intelligence: Current trends, the state of play and perspectives, IPTS Technical Report Series, EUR EN, IPTS, Seville.

Smits, R. & A. Leyten, 1991, Technology Assessment: Waakhond of speurhond? [Technology Assessment: Watchdog or tracker dog?]. Dissertation, Zeist, Kerckebosch, The Netherlands.
Tans, S.J., A.R.M. Verschueren & C. Dekker, 1998, Room-temperature transistor based on a single carbon nanotube. Nature, Vol. 393.
The Economist, 2003, On the Tube, May 8.
Van Lente, H., 1993, Promising technology: The dynamics of expectations in technological development. Delft, Eburon.
Van Lente, H., 2000, Forceful Futures: From Promise to Requirement. In: N. Brown, B. Rappert & A. Webster, Contested Futures. A sociology of prospective technoscience. London, Ashgate Publishing Company.
Van Lente, H. & A. Rip, 1998, The Rise of Membrane Technology. From Rhetorics to Social Reality. Social Studies of Science, Vol. 28 (2).

Presentation 3 :

Hopes and fears of nanotechnology (No. 2)
- Hopes range from useful option via enabling technology to building from the bottom
- Fears range from possible toxicity of nanoparticles to grey goo
- Collingridge's dilemma of control
- Constructive TA

Emerging Irreversibility (No. 3)
- Decrease of socio-cognitive fluidity
- Can be undone, but with increasing costs and efforts
- Examples: launch of a new journal, or roadmaps
- Tracing: dynamics of expectations and agenda building, using a three-level framework (basic research / market; social/economic, research field, research groups)

Expectations / Method / Sources / Findings / Results (No. 4; diagram not reproduced)

Conclusions (paper) (No. 5)
- By applying the three-level framework for expectations and agendas, emerging irreversibilities related to the specific application of nanotubes could be traced
- The subsequent identification of emerging irreversibilities adds to the understanding of the fluid character of emerging technologies

64 Paper 4 : Scenario-based Roadmapping A Conceptual View FERNANDO LIZASO AND GUIDO REGER 19 ABSTRACT - Great efforts have to be devoted to create shared visions. Visions are desirable pictures of the future which guide organisations in making decisions. This paper aims at developing a new methodology that links both roadmapping and scenarios in order to plan the co-ordinated development and deployment of new and existing technologies and applications. It involves the identification, analysis, assessment and projection of both technologies and applications necessary to meet market needs under different future circumstances. It is supposed to help to avoid isolated and linear thinking and vagueness, as well as improve creativity, communication, collaboration and integration in science and technology planning, while giving people at organisations room to create the future and remain flexible with uncertainty. Our concept is based on a deep analysis of the literature and an example, which should illustrate how the method works. I. INTRODUCTION Technologies do not occur in isolation; they are instead part of a complex system of knowledge generation and transfer. Technologies may take years to be applied because of technical barriers, the magnitude of change it imposes, or even strategic issues [24, pp ]. Maintaining a stream of products/services to the market requires establishing appropriate knowledge flows, in order to achieve a balance between market-pull and technology-push. Managers have to understand the dynamics of technological innovation and patterns in its underlying dimensions [24], in order to choose those technologies that are of value for the organisation long before the market introduction. This requires pursuing long-term inventions for knowledge acquisition, engaging in medium-term advanced development projects to prepare technology for application, and executing near-term product/service and process innovations to maximise competitive advantage [3, p.96]. However, managing the appropriate balance between the application and extension of existing technologies and the generation and deployment of new ones is one of the most important and difficult tasks in technology and innovation management [23, p.52]. The pervasiveness, non-linearity, extent and pace of change of technical progress make the environment increasingly complex and volatile. It requires appropriate planning methods that link both technology and business objectives. This paper aims at developing a method to plan the development and deployment of new and existing technologies and applications, linking both together right from the 19 Institute for Innovation and Internationalisation (I3), University of Applied Sciences Brandenburg, Magdeburger Str. 50, D Brandenburg, Germany. flizaso@hotmail.com and reger@fh-brandenburg.de Session 4 TALES FROM THE FRONTIER 64

65 start. It involves the identification, analysis and assessment of technologies and applications for possible futures that can happen under different circumstances. The method developed here aims at improving effectiveness in science and technology (S&T) planning. It should foster commitment, risk taking and experimentation to create and achieve visions and improve flexibility to refine and adjust robust plans and strategies in route. Roadmapping and an advanced scenario approach were combined in order to point out conceivable S&T landscapes ahead. Visions, planning and roadmapping will be described in chapter two. Our new approach is developed in chapter three, by linking roadmapping and scenarios and by describing the process on the basis of Rapid Product Development (RPD) technologies for the automotive industry as an example. The main results are summarized in chapter four. II. VISIONS, CURRENT PLANNING AND S&T ROADMAPPING 1. Visions - the source of power and creativity in creating the future To survive and thrive, organisations need management sensitive to its context. First, it needs to perceive threats and opportunities outside, then to stimulate the change necessary to take advantage - systematically. Successful firms act in advance rather than react, by going ahead rather than by running behind. Although managers are sensitive to changes and seem to have developed individual capabilities for foresight, it is rather difficult to bring all the organisation s thinking, competences and acting together. People tend to react against the sharp, real events which usually result from a crisis rather than to act on the basis of weak, fuzzy signals. People s minds tend to grasp those signs which are part of its previous experiences, what is meaningful to their own view of the future [5, pp.30-48] [15, pp.69-92]. Most managers, in their desire to know and reduce uncertainty about the future, spend much time on answering the question: What will happen to us? In so doing, they attempt to predict it. Nevertheless, uncertainty is the only certain thing about the future. Managers should spend more time on answering the question: What will we do if this or that happens? Realising the fact the future is plural, managers need to think alternative paths to the future in order to remain flexible against uncertainty. That helps to anticipate possible futures, build memories of and prepare people for them [5, p.30, pp.49-53]. The further managers look into the future, the less they know about S&T. In this sense, there are: technologically achievable futures, argumentative achievable futures, visions and utopias [8, pp.34-37]. The first are well defined end-states, technically achievable at present, unless no resources are available. Argumentative achievable futures are theoretically accomplishable, but not in practice. Visions are images of desirable, conceivable futures that drive commitment. However, the paths and Session 4 TALES FROM THE FRONTIER 65

66 conditions to achieve them are still uncertain. Utopias are too far, unconceivable futures. The S&T knowledge available does not allow their achievement. Visions [20, pp ] capture the sense of future images that guide firms in making decisions. Management aims at achieving objectives, which represent visions. Objectives help managers to motivate people and measure the firm s performance. However, people do not become long-term oriented because of accuracy in the likelihood of quantitative foresights nor because they have to, but because they want to. Only when people can visualize a better future, they can begin to create it. Shared visions are the answer to the question what do we want to create. A vision is truly shared when many people have a similar picture, and are committed to it, because it reflects their own personal vision. It leaves room for people to shape their own future, fostering risk taking and experimentation. Such visions take time to emerge, and require ongoing conversation: they grow as product of interactions of individual visions, and spread because of a reinforcing process of increasing clarity, enthusiasm, communication and commitment. 2. Current strategic technology planning Planning helps managers to identify the objectives and develop a plan sequence to achieve them. It is a process of thinking which results in decisions about courses of action to be taken in the future. Its outcome, a plan, is a commitment to specific courses of action in terms of the what, who, when, where, and why of a certain situation [2, p.306]. Therefore, the establishment of premises, assumptions, and known conditions about future developments or events that will affect the operation of plans becomes of great importance for successful planning. Successful planning must consider what has been happening, what s going on and what could happen. Strategic planning involves decisions that identify the market needs a company wants to address, the products/services to satisfy those needs and the technologies needed to place those products/services on the market. Those companies who want to stay ahead have to make the strategic management of technology an integral part of their business strategy [1, p.45] [21, p.210, 237]. It involves identifying, selecting, and developing the technologies to support the products/services and processes requirements. Current practice, attempts to develop plans toward long-term visions [12, pp.65-69] based on what could happen (approximations) rather than on inflexible, pure trend extrapolations. It aims at identifying gaps and incongruities between the vision and present actions. In a changing environment, managers obviously need decision-making aids in an understandable and user-friendly manner. A study on technology foresight activities of 26 multinational companies reveals that the firms investigated use numerous different methods for technology foresight with different intensity [17]. There is a definite predominance of methods based on the interaction between different players and which are person and communication-oriented. Great importance is attached to methods involving a high proportion of interviews with internal or external experts, and to 'teasing out' ideas in meetings or workshops. A surplus for a method or tool is if it is suitable for visualization like e.g. road maps or scenarios and Session 4 TALES FROM THE FRONTIER 66

67 enhances sharing the results. Since results must be readily transmitted throughout the organisations, the focus is on communicating results rather than on analytical elegance of content in foresight. Among quantitatively oriented instruments, on the other hand, rather simple tools predominate. 3. Roadmapping - linking science and technology with business objectives Roadmapping [7] [10] [11] [13] [14] [16] [19] [21] is a decision-making process that links S&T development with company goals and strategies. It is a look at the future to anticipate what will happen and what has to happen for moving ahead [4]. It provides a consensus view or vision of the S&T landscape ahead, as well as ways to identify, evaluate, and select alternative paths that can be used to achieve S&T objectives. It helps organisations, at either the corporate, industry/discipline (even at crossindustry/national or international) or government levels, to collaboratively identify future product needs, map them into process/product/service technology alternatives, and develop plans to ensure that the required technologies, skills and resources, will be timely available. Roadmapping and its outcomes - the roadmaps - can be used in enhancing communication and consensus achievement within the organisation and with external audiences (e.g., business functions/units, suppliers, customers, investors); resources allocation; skills and capabilities development; and identifying gaps, barriers and opportunities in S&T programs. Roadmapping provides information to make better decisions by identifying critical S&T areas of high potential or strategic value and technology gaps, when it is not clear which technology alternative to pursue, how and when a technology will be available, or when it is necessary to coordinate the development or acquisition of multiple technologies. Roadmaps portray the evolution of markets, products and technologies to be explored, together with the linkages between the various perspectives (see Fig. 1). They are time based plans that help organisations to determine where they are, where they want to go and how to get there. Roadmaps graphically represent the critical system requirements in the field (e.g. business objectives, product and process performance targets) and the alternative technologies and milestones required for meeting them. They are portrayals with quantitative and qualitative attributes of the structural and temporal relationships among S&T elements, as they become applications over time. Ordinarily, the spatial dimension depicts system elements (nodes) of a targeted S&T field and their relationships at a given point in time, and the time dimension accounts for the evolution of S&T capabilities. Session 4 TALES FROM THE FRONTIER 67
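The layered, time-based structure described above lends itself to a simple data representation. The sketch below is illustrative only: the node names (M1, P1, T1, RD1) echo the schematic in Fig. 1, while the target years and linkages are invented assumptions, not taken from the paper.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """One roadmap element: a market driver, product, technology or R&D project."""
    name: str
    layer: str          # "market" (know-why), "product" (know-what),
                        # "technology" / "rd_project" (know-how)
    target_year: int    # know-when: the year the element is needed or expected

@dataclass
class Roadmap:
    nodes: dict = field(default_factory=dict)   # name -> Node
    links: list = field(default_factory=list)   # (enabler, enabled) pairs

    def add(self, node: Node):
        self.nodes[node.name] = node

    def link(self, source: str, target: str):
        """Record that `source` enables `target`, e.g. a technology enables a product."""
        self.links.append((source, target))

    def gaps(self):
        """Links where the enabling node arrives later than the node it supports."""
        return [(s, t) for s, t in self.links
                if self.nodes[s].target_year > self.nodes[t].target_year]

# Hypothetical mini-roadmap in the spirit of Fig. 1
rm = Roadmap()
rm.add(Node("M1", "market", 2008))       # market need
rm.add(Node("P1", "product", 2007))      # product addressing M1
rm.add(Node("T1", "technology", 2006))   # technology required by P1
rm.add(Node("RD1", "rd_project", 2009))  # R&D project feeding T1, deliberately late
rm.link("P1", "M1"); rm.link("T1", "P1"); rm.link("RD1", "T1")

print(rm.gaps())   # [('RD1', 'T1')]: the R&D project is not timely for the technology
```

Even such a minimal representation already supports the gap identification described above: a link whose enabler matures later than the element it supports marks a point where development, acquisition or resource reallocation has to be considered.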

Figure 1. Schematic representation of a generic S&T roadmap's nodes and links, showing how all the perspectives involved can be aligned: a business/market layer (M1, M2) captures the purpose (know-why); a product/service capability/system layer (P1-P4) captures the delivery (know-what); technology/skills/resources (T1-T4) and R&D projects (RD1-RD5) capture the resources (know-how); all are plotted against time in years (timing, know-when) and lead towards the vision. Source: adapted from Kostoff et al.

Roadmaps can take different forms, though multi-layer roadmaps including market, product and technology layers are the most common. Typically, the top layer [16, p.13] refers to the purpose that is driving the roadmap (know-why), while the bottom layer relates to the resources (know-how) that will be deployed to address the objectives (see Fig. 1). The middle layer provides a bridging/delivery mechanism between purpose and resources (know-what). It ordinarily focuses on product development, as this is the route through which technology is deployed. Services, capabilities, risks or opportunities may also be appropriate for the middle layers, to understand how technology can be delivered so as to provide benefits to the organisation and its stakeholders. The content of the roadmap, in terms of the type and amount of information, depends on the purpose, level and time horizon of the roadmapping effort. The time dimension [16, p.13] can be adapted in terms of time horizon, ordinarily short in rapidly changing or emerging sectors and much longer in slowly changing or mature industries. A logarithmic scale can be used, with more space allocated to the short term than to the long term, owing to the large amount of information to map out in the near future. Space can also be allocated for long-range considerations, together with the current situation, with respect to the competition or to define gaps between the current position and the vision. The value of roadmaps largely depends on the relevance and validity of their content. Roadmaps have to address specific needs or questions, and have to be reviewed periodically to stay current with the information they portray. Since each application can differ in many respects, the approach usually has to be customised in terms of process, purpose, graphical format and frequency of revisions to suit each particular

69 circumstance. The range of different roadmap types [7] [10, p.107] [13, pp ] [16, pp.5-10] reflects this facts. There are four prerequisites in roadmap development [11, p.10]. First, roadmapping is (1) needs-driven or mission-pull. The implication is that R&D will only take place when there is a predefined application. Secondly, it is (2) fully integrated; the process brings problem holders and solution providers together into an integrated and cooperative team consensus to understand the process to reach end-states. Thirdly, roadmapping has to be (3) comprehensive: it has to address the near-, mid-, and longterm S&T needs, objectives and full range of potential solutions, considers the impacts of all system interfaces, and identifies the key elements and functions that must be integrated. It provides ways to identify, evaluate, and select the technology alternatives which can be used to satisfy the needs and objectives. Fourthly, roadmapping has to be a (4) credible and defensible process which documents the reasons for decisions and gives framework to extract controlling indicators. III. THE NEW APPROACH: LINKING ROADMAPPING AND SCENARIOS 1. The new approach - the process at a glance Roadmapping is about assuming a given future(s) and providing paths to get it, by means of a certain amount of foresight and a certain amount of consensus. Then, the aim is to evaluate what is technically possible, desirable and expected, and to understand what needs to happen for moving ahead. Otherwise, roadmaps become a one-way street rather than a road map. Scenarios [6] [8] [9] [10] [18] realizes the future is plural. Scenarios are descriptions of conceivable futures that make managers understand what is going on and gain sensitivity to what if questions. It requires incumbents to identify the critical factors affecting a targeted system, the relationships among them, and the sequence of and connectivity between events that would lead to different plausible futures. Scenarios helps to bridge the gaps among thinking and action when addressing specific questions; overwhelming scenarios are often of little or non practical value. A comprehensive method for strategic technology planning is to link both roadmapping and scenarios. Technology scenarios describe the development of technology systems, through factors that involve technologies of strategic value because of their adequacy and potential to perform the tasks/functions or achieve the features the technology system has to, or can, accomplish over the time. Alternative directions, extent and pace of technological progress can make scenarios happen at different points in time. S&T knowledge allows experts to map out the future of technology systems comprehensively, by building the plausible scenarios, and assessing their possible years of occurrence, so that, a technology scenarios sequence along the targeted planning horizon results. It allows managers to identify, analyse and select what technologies would constitute the possible futures and, very important, what are the particular performance parameters and targets needed to stay in. As this Session 4 TALES FROM THE FRONTIER 69

70 happen, it is possible to determine what to do, when to do it, and how and why it has to be done: managers can decide on resources leverage and alternative paths to pursue, to move ahead from scenario to scenario into the future. The implication is that (1) a particular scenario approach constitutes the intrinsic fundamentals to produce roadmaps and, (2) a set of product and process technology roadmaps becomes the scaffold of a Technology Roadmap Architecture (TRMA). It accounts for two dimensions that link sets of, at least, market, product/service and process scenarios addressing particular questions in an associative and aggregative manner, in a technology-push-requirements-pull approach.the process (see Fig. 2) consists of six steps. The first one, (I) Roadmapping Preparation, is going to be devoted to determine the need which will drive each roadmap of the appropriate TRMA, and to plan the roadmapping effort, as well. Identifying the influence factors containing candidate technologies and selecting the key ones strategically constitute the goals of the step (II) System Analysis. Given the key factors, it is necessary to establish their possible futures states: (III) Scenario Projection aims at establishing a catalogue of portfolios of conceivable, alternative technological developments (projections) each factor could take along the planning horizon, because of direction and extent. In (IV) Scenario Building consistence and cluster analysis come into action to build and define the amount of plausible scenarios that fully describe the future. Thereafter, the full range of possible futures, the future space, is going to be graphically represented by means of multidimensional scaling (MDS) as a map. (V) Session 4 TALES FROM THE FRONTIER 70

71 Time Assessment is critical; experts look for the technological solutions and expected years of occurrence of the projections, which in its turn, make the scenarios happen at certain points in time. Translating the results into the map makes it achieve the time perspective. Lastly, (VI) Road-Mapping provides a framework to identify and select roads to get the future by comparing the constituents of the TRMA. 2. The process step by step Given an overview of the process, each of the steps will be described as following. An example, a roadmapping effort of the technologies that Rapid Product Development [9] [22] in the automotive industry involves, illustrates the logic of the methodology. Roadmapping Preparation (I) First, it is obviously necessary to define the roadmapping effort and, to plan its realisation and implementation, including sponsoring, planning, execution and controlling. First step s major objectives are: define the need which will drive the roadmap and the structure and dimensions of the TRMA required. Therefore, it is important to define and analyse the targeted system; and to identify the maturity stages in which the industry, candidate technologies and organization(s) are, and the phase they are entering. That gives an idea of both the complexity and unpredictability of the environment, and therefore, the type of scenario project to be performed and the incumbents to be invited. Like most mature industries, the auto industry is trying to go out of the mature stage that characterizes the bulk of it, by adding value through mass-customisation. Advances in both process and product technologies, and social trends promise change the way in which cars are going to be developed, produced, distributed, marketed, recycled and used in this direction. The industry has, for example, to develop cars more customer oriented, quickly and efficiently. RPD is a manufacturing practice which not only aims at promoting process shortenings, but also at improving customer satisfaction and loyalty, and at reducing costs. It involves dissimilar technologies such us Computer Aided Design (CAD), Virtual Reality (VR) and Rapid Prototyping & Manufacturing (RP&M), and management techniques like e.g. Computer Supported Co-operative Work (CSCW), Simultaneous Engineering (SE) and Knowledge Management (KM). The industry tends to develop cars as virtual as possible in immerse environments through configurable, intuitive, knowledge based CAx-Systems; to perform tasks simultaneously and, if possible, automatically, without any time and place constraints in an integrative manner; and to translate product data into the real world directly from digital models. Although trends are known, the paths to, and extent and timing of change are still rather fuzzy. Currently, technology incompatibilities and inconsistencies often arise. Moreover, since not only large corporations take part in the system but also small, high specialized manufacturers, suppliers, research institutes, and industry/trade associations, too, coordinating diffusion and performance improvements of so many technologies at different maturity stages becomes difficult tasks. Then, experts from the once in large corporations, to special car makers, to Session 4 TALES FROM THE FRONTIER 71

72 suppliers, to research institutes, to lead users, and industry/trade associations should participate in road mapping RPD technologies. Since the system has became so intricate and unforgiving, a change in either product or process requires a corresponding change in the other. It requires a TRMA including product/process technology roadmaps, and market need scenarios along the value chain at both the corporate and industry levels. The example refers to a process technology roadmap at the industry level. System Analysis (II) System Analysis aims at establishing a catalogue of influence factors, and selecting the key ones strategically. The amount and type of factors a scenario project can include is finite. Technology scenarios concentrate on technological issues, with focus on the system under scope and conveniently aggregated to the project level. Decomposition [9, pp ] is a method for systems analysis. Products and processes can be separated into their constituent elements: components or groups of them and steps or phases, respectively. There are associative and aggregative (spatial and functional) relationships among system parts that refers to constituent and level relationships. That constitutes the scaffold of the TRMA along and through the value chain. To what extent it is necessary to break down the system will largely depend on the roadmap level and purpose. At the industry level, RPD can be roughly decomposed in four steps: (1) Product Planning, (2) Product Construction, (3) Production Preparation and (4) Product Testing [22, pp.4-6]. Decomposition helps to identify the relevant functions/features the system parts have to, or can, perform/accomplish under particular performance and quality requirements/standards. Given the key functions/features, experts can identify the technology fields comprising the many technologies that can be involved in accomplishing those critical functions/features. These technology areas will form the influence factors catalogue. Regarding the example, power walls, Computer Automatic Virtual Environment (CAVE), Virtual Desktop, Head Mounted Displays (HMD) and responsive workbenches are some but a few of the means to digital product data representation and visualisation. Keyboards, data/pinch globes, tracking systems and voice recognition are means to digital product data entry. Both groups of means are human-machine interface technologies. Human-Machine Interfaces constitutes a technology field, and therefore an influence factor for RPD. Not all factors affect the system in the same way nor to the same extent. Incumbents decide on what factors to consider. Even when a factor influences the overall system, its relevance depends on particular circumstances, including industry maturity, strategic issues, etc. The structure of the system of factors can be well visualised by means of influence matrix and system grids [6, p.192] [18, p.33-41]. Thereafter, it is possible to identify the role each factor plays to the system, and strategically select a set of ones on behalf of the type of scenario project [6, pp.80-81, pp ] [8, pp ]. It helps to understand the system characteristics, to interpret results, and Session 4 TALES FROM THE FRONTIER 72

73 to select indicators for controlling and monitoring. RPD key influence factors are e.g. Virtual or Physical Modelling & Prototyping, Reverse Engineering, Data Management, Change Management. Scenario Projection (III) Scenario Projection means establishing the future states each key factor can take along the planning horizon. First, it requires to determine the possible directions, then the extent. Although the future state of some factors (non-critical) is straightforward, most of them (critical) can take several ones. Alternative projections make different scenarios happen, and therefore, determine the morphology of the future space [9, pp ]. Hence, the projections assumed for a factor must be clearly different, and have to describe the full range of possibilities. For some factors, there is only one metric along which to measure their development. Many others are instead multidimensional [6, pp.84-86] [8, pp ]; several facets come along when attempting to describe their state. Two dimensions have at least to come up to fully describe a factor state. Since the dimensions which describe a factor state are virtually independent, different futures can occur. A change in one dimension does not necessarily imply a corresponding change in the other. Both dimensions can, or cannot, change in the same direction and magnitude at once. One can even change at expenses of the other. Herewith, the idea of projections-portfolios comes along. Human-Machine Interfaces technologies aim at improving productivity, but different technologies address different problems/questions: while some technologies aim at better representing the information which digital product models portray by seeing, hearing, etc., tracking users in real time with higher or lower degree of realness; technologies to enter data provide ways to work on those models with higher or lower degree of interaction by eye-tracking, hand-typing, touching, voice instruction, etc. In other words, to what extent it is possible to be immersed in the environment of and interact with digital models are the most important capacities of Human-Machine Interfaces technologies for RPD. Hereby, the two dimensions which describe the state of Human-Machine Interfaces were called dousing (ordinates) and interactivity (abscissas). In this fashion, the relevant dimensions, which describe each factor state have to be identified and labelled. Session 4 TALES FROM THE FRONTIER 73
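To make the notion of a two-dimensional projections-portfolio concrete, the sketch below models the state of one factor. The dimension names follow the Human-Machine Interfaces example in the text; the numeric thresholds and the logistic growth parameters are purely illustrative assumptions, and each dimension is assumed to follow an S-shaped performance curve over time, as elaborated in the next paragraphs.

```python
import math

def s_curve(year, start_year, midpoint_offset, steepness=0.6, ceiling=1.0):
    """Logistic performance curve: near zero at the start, approaching `ceiling`
    as the technology matures. Parameters here are illustrative, not calibrated."""
    return ceiling / (1.0 + math.exp(-steepness * (year - start_year - midpoint_offset)))

# One dimension-s-curve per portfolio axis (Human-Machine Interfaces example):
# 'dousing' on the ordinate, 'interactivity' on the abscissa.
DIMENSIONS = {
    "dousing":       dict(start_year=2004, midpoint_offset=8),
    "interactivity": dict(start_year=2004, midpoint_offset=12),  # assumed to mature later
}

def projection_at(year, threshold=0.5):
    """Label the factor state: A = neither dimension advanced, B/D = only one of
    them advanced, C = co-ordinated progress in both (cf. Fig. 3). The mapping of
    B and D to the two axes is an assumption for this sketch."""
    high_y = s_curve(year, **DIMENSIONS["dousing"]) >= threshold
    high_x = s_curve(year, **DIMENSIONS["interactivity"]) >= threshold
    return {(False, False): "A", (True, False): "B",
            (False, True): "D", (True, True): "C"}[(high_y, high_x)]

for yr in (2006, 2012, 2015, 2020):
    print(yr, projection_at(yr))   # e.g. 2006 A, 2012 B, 2015 B, 2020 C
```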

74 To what extent can both capacities be achieved depends on technology performance. Thus, alternative projections can be established in advance on the basis of the potential of performance along the portfolio dimensions. Physical Modelling & Prototyping technologies translate product data into real things. The capacities to produce physical models/prototypes with higher or lower degree of realness, and to require more or less efforts to do it, called validity and effort respectively, depends on the performance of the involved technologies. As long as technologies mature, validity and effort will anyhow approach their physical limits some day. Consequently, each projectionsportfolio consists of two dimension-s-curves of the type performance vs. time. Nevertheless, validity and effort are no metrics, they are indeed indexes that come to sum up the many metrics involved in performing/achieving system functions/features. It was said, that both the metrics and targets along which to evaluate the adequateness of technologies change over the time, owing to maturity stage or strategic issues. For RP&M technologies, for instance, no metric is preferred yet, because the industry is still looking for a dominant design. It largely depends on the technological approach itself and the RPD function to carry out. Fig. 3 shows how the projections-portfolio of Physical Modelling & Prototyping technologies qualitatively describes the full range of possibilities along both validity and effort for a targeted planning horizon. Technical progress makes projections Session 4 TALES FROM THE FRONTIER 74

happen from left to right and from bottom to top; moreover, once a level of performance is achieved, it does not go back. In the case of co-ordinated development in both the "Y" and "X" dimensions, the factor moves ahead from A to C. If progress in either "Y" or "X" is lower or higher than in the other, B or D comes about. In other words, B and D are alternative paths to C. In case an intermediate but direct path from "A" to "C" is required, a fifth "E" projection becomes the centrepiece of the portfolio. Summing up, dimension-s-curves, which aggregate the many curves along the metrics involved in accomplishing system functions or features, provide the framework for establishing projections. In this regard, while the catalogue of projections-portfolios is a repository of the possible directions and alternative paths into the future, the dimensions are a repository of the many changing performance metrics. That helps to define and keep in scope the exploration area, from technologically achievable futures to visions; to achieve consensus and avoid vagueness; to organise knowledge; to link basic to applied research; and to differentiate invention from innovation. It triggers the creative tension and aspiration that enhance commitment to vision creation. S&T projections-portfolios not only avoid visions floating in space, but also the pitfalls of utopia.

Scenario Building (IV)

Given the projections, it is possible to build the scenarios by means of consistency-matrix and cluster analysis; they help to identify a number of the most consistent projection bundles, each containing one projection of every factor, and to cluster them according to their similarities (a minimal sketch of this bundle-ranking step is given at the end of this subsection). The consistency matrix [6] [8] [9] [18] enables the numeric assessment of impacts/interrelationships among projections by pairs, in order to rank the huge number of possible projection bundles according to consistency. In most scenario-technique approaches, the two or three most consistent projection bundles are the definitive scenarios [18, pp.49-56]; this prevents parallel projections from occurring and, in turn, renders the effort spent on developing projections-portfolios worthless. Reality is far more complex. The inductive approach used here considers hundreds of possible bundles [8] in order to select the appropriate number of raw-scenarios, groups of bundles, which result from a cluster analysis [8]. By identifying the projections that most often take part in the clusters, it is possible to build the scenarios. A picture of the future can best be achieved by means of MDS: it uses geometric proximity to show similarities among objects as a map. Projection bundles (objects) are placed at distances from each other that correspond to their dissimilarities [6, p.93] [9]. The map (see Fig. 4) consists of a few separated but densely populated islands (raw-scenarios) of spheres (bundles), which can depict additional information by sizing or colour coding (e.g., consistency, affiliation). Seven scenarios cover the future spectrum of RPD, namely: Work Benches (I), 24Hs (III) and Core Rigidity (VII), dominated by conventional technologies; Boutique (V), characterised by advanced technologies which remain expensive and unreliable for the bulk of the market; and Virtual as Real (II), High-Tech RPD (IV) and High Performance (VI), in which high-performance virtual environments, self-configurable and knowledge-based CAx-Systems, and advanced planning and QM tools that enable one-to-one communication to address market niches or individual customer needs dominate. The map shows that the future space of RPD ranges from conventional (right-like scenarios) to advanced technologies (left-like scenarios); this has severe implications for product development and for the hallmarks of its outcomes, the cars that the product development process is supposed to produce.
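As a minimal sketch of the bundle-ranking step described above (the factors, projections and consistency scores are invented for illustration, and a real project would follow this with an explicit cluster analysis and MDS step), one can enumerate all projection bundles, score each bundle by summing pairwise consistency values, and keep the highest-ranking ones as raw material for scenario clustering.

```python
from itertools import product

# Illustrative key factors, each with alternative projections (invented names).
FACTORS = {
    "HumanMachineInterfaces": ["HMI-conventional", "HMI-immersive"],
    "PhysicalPrototyping":    ["RPM-lab-only", "RPM-production-grade"],
    "DataManagement":         ["DM-file-based", "DM-integrated"],
}

# Pairwise consistency judgements between projections of different factors:
# -2 = strongly inconsistent ... +2 = strongly mutually supportive.
# All values are assumptions made up for this sketch.
CONSISTENCY = {
    ("HMI-immersive", "DM-integrated"): 2,
    ("HMI-immersive", "DM-file-based"): -2,
    ("HMI-conventional", "DM-file-based"): 1,
    ("RPM-production-grade", "DM-integrated"): 2,
    ("RPM-lab-only", "HMI-immersive"): -1,
}

def consistency(p, q):
    """Symmetric lookup; unrated pairs are treated as neutral (0)."""
    return CONSISTENCY.get((p, q), CONSISTENCY.get((q, p), 0))

def bundle_score(bundle):
    """Sum of pairwise consistencies over all projection pairs in the bundle."""
    return sum(consistency(p, q)
               for i, p in enumerate(bundle) for q in bundle[i + 1:])

bundles = list(product(*FACTORS.values()))   # every combination, one projection per factor
ranked = sorted(bundles, key=bundle_score, reverse=True)

for b in ranked[:3]:                         # the most consistent raw bundles
    print(bundle_score(b), b)
```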

Time Assessment (V)

Having scenarios floating somewhere in the future is not enough unless we identify what needs to happen, and when, to achieve them. Organisations decide on the paths to pursue on the basis of strategy. Therefore, incumbents are still trying to identify both the full range of possible futures and the paths to get there. A database containing, for each of the factors (see Fig. 5): (1) the system parts, (2) the portfolio dimensions, (3) the metrics, (4) the performance targets required to perform the functions/features each system part has to, or can, accomplish along the planning horizon, and (5) the technologies supposed to do it, brings the critical system elements together for analysis and documentation; an illustrative sketch of such a table follows below.
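The sketch below is a hypothetical miniature of such a table (all metrics, targets, technologies and years are invented assumptions): it records, per projection, the performance targets together with the candidate technologies and the years in which experts expect them to reach those targets, and it derives the expected year of occurrence of the projection as the latest of these years.

```python
# Hypothetical time-assessment table for one key factor (Physical Modelling & Prototyping).
# Each projection lists (metric, target, candidate technology, expected year in which
# the technology reaches that target). Values are illustrative only.
TABLE = {
    "B": [
        ("build accuracy [mm]", 0.05, "stereolithography, improved resins", 2008),
        ("build time per part [h]", 4.0, "high-speed laser sintering", 2010),
    ],
    "C": [
        ("build accuracy [mm]", 0.02, "micro-stereolithography", 2012),
        ("build time per part [h]", 1.0, "parallelised sintering heads", 2015),
        ("material strength [% of series part]", 90, "metal powder processes", 2014),
    ],
}

def year_of_occurrence(projection):
    """A projection occurs only once every required target is met, i.e. in the
    latest of the expected years of its candidate technologies."""
    return max(year for _, _, _, year in TABLE[projection])

for proj in TABLE:
    print(proj, "expected around", year_of_occurrence(proj))
# A scenario (projection bundle) then occurs once all of its projections have occurred.
```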

Such a database provides ways to identify, analyse and select the technologies supposed to provide solutions. Furthermore, the years in which technologies can a priori achieve the performance targets become inputs to the table. Given these years, it is possible to define the years of occurrence of each projection, and therefore of the scenarios. First, the table helps to identify the technologies supposed to achieve the performance targets for each of the relevant metrics, for the system parts of each key factor, which in turn make many of the projections happen. Particular technology combinations, or efforts spent on certain developments, can make more than one projection of the same factor occur. Alternative solutions for the same projection can come along, too. Performance gaps and/or incompatibilities among solutions might also appear. Where this occurs, solutions, if needed, have to be found, or decision points and uncertain or risky areas have to be highlighted. The result is a catalogue of technologies that will make the projections happen. Given the solutions, it is necessary to look at the expected years of occurrence of the projections. Projections will happen in the year in which the candidate technologies

78 achieve the predetermined performance targets. Divergences in the years in which technologies achieve performance can make projections succeed in a period to be described by simple statistics rather than at a point in time, however. Since an expected year of occurrence corresponds to each projection and, each projection bundle has its projections profile, each projection bundle occurs at a point in time. Road-Mapping (VI) Translating results into the map, by colour coding for instance, makes it achieve a time perspective (see Fig 6). Although almost all the RPD scenarios involve bundles occurring twenty years time, they tend to happen from right to left in time (each five years). Bundles supposed to occur in five to twenty years form the right-like scenarios. However, there are - only - two bundles of Work Benches and 24Hs happening twenty years time; the bulk of bundles of Work Benches are going to occur in fiveten years, and most bundles of 24hs will instead happen ten to fifteen years time. Left-like scenarios will occur twenty years time. In the middle-up of the picture, the scenario V will happen in fifteen-twenty years. Technology is supposed to make organisations move to the left, nevertheless, there are alternative paths to get the future. Organisations can turn left to move ahead into the future, or still developing technologies further to remain on the right-like scenarios. The path for the organisations that decide to remain on the right of the map is plenty of bundles in the mid-term, but rather inconsistent in the near- and long-term future. Those attempting to cross over to the left, on the other hand, have to be able to bridge the - technology - gaps. At first glance, organisations that move to the left can do it over Core Rigidity and Boutique. It does not mean, however, that the companies which firstly remain on the right of the picture will never get the left. They can do it later, directly from Work Benches or 24Hs, moving straight on horizontally to the left without stops (unless E projections make new in-between-scenarios happen in Session 4 TALES FROM THE FRONTIER 78

79 further revisions of the roadmap). There are indeed many other possible paths, than these obvious highways. How and when organisations move depends on market needs and company strategy; organisations have to decide on, according to needs and possibilities. Incumbents have to carefully evaluate which path(s) to pursue, taking into account the overall inward/outward information available. So far, the technological possibilities have been explored, but the requirements-pull side of the approach is still pending. The driving need of a roadmap has to be either subordinated to or be geared with another ones. The TRMA consists of process and product roadmaps and market scenarios. A 3D matrix (see Fig. 7) helps to identify the product (P) and process (T) scenarios supposed to address the customer needs along the planning horizon which market (M) scenarios portray. In so doing, organisations can decide on where to go and how to get there. Regarding the example, it allows incumbents identify alternative RPD-scenario sequences able to produce the types of cars required to address market needs over the time, in a co-ordinated manner. Roadmapping is iterative. Once the targeted futures and paths to get them are identified, a revision of the roadmaps can be required. Since requirements/conditions were established a priori, in order to explore the range of possibilities, it might occur the roadmaps have to be refined. Last but not least, managers need to also plan technology diffusion to stay on the markets they want to address. It requires to co-ordinate technology development and deployment in order to timely achieve expected degree of diffusion. As this occurs, the technology paths achieve the one another feature: size. IV. CORE FINDINGS AND EXPECTED BENEFITS OF THE APPROACH The approach allows managers to not only define and map out the S&T landscape to be explored, but also to navigate it from bundle to bundle into the future towards Session 4 TALES FROM THE FRONTIER 79

80 visions while looking at their constituents - the candidate technologies, whose direction and extent of development are known. Incumbents, who strategically develop roadmaps along and through the value chain, identify what alternative sequences of technologies and applications can better address future market needs over time, in a coordinated manner. By picking out selected projection bundles, managers determine what, how and when critical technologies and applications have to be developed and deployed, leading to consensus on priorities and path forward. As this occurs, incumbents can decide on resources leverage to substitute, accelerate, continue or stop S&T programs to move from scenario to scenario. Summing up, the several tools and concepts applied make this approach new in some respects. Maps are moments in the process of decision making. As things happen, the S&T landscape is expected to change, forcing incumbents to systematically refine and adjust plans and strategies to stay in route. In this respect, since the approach gives room for incumbents to pursue their own strategies, they do not only drive their own paths into the future, but also benchmark the paths each others are pursuing. Secondly, both the very intrinsic ability of scenarios to process qualitative/quantitative data, and to integrate other tools for processing data and communicating results, as well as the TRMA based on decomposition, make the approach adaptive and flexible enough to deal with the changing complexity and unpredictability of the context and users requirements over the time. Incumbents decide on the roadmaps to be done and the type of scenario project needed to deal with on behalf of industry/technology maturity and resources availability, in an iterative manner. Thirdly, the approach aims at providing framework to proceed as objectively, transparently, fully integrated, holistically and rationally as possible, in order to help management in its broadest context to strategically, comprehensively and systematically think about possible futures. Nevertheless, the balance among hard and soft data that is most suitable depends on particular circumstances (e.g., resources and information availability). Fourthly, unlike any scenario approaches, projections-portfolios based on S&T knowledge make the definition and exploration of the roadmapping area for the whole planning horizon in only one-step possible. It not only contributes to hold vision while remaining committed to seeing reality, but also to identify alternate roads to desired end-states while not entering the pitfalls of utopias. Portfolio-projections are of great value; they become repository of alternative directions and extent of technical progress, including the relevant metrics involved. Although R and D are closing up steady, a better coordination and transition of activities, and/or integration of disciplines in S&T development is required. Bringing both technologies and applications right from the start together in the planning process through the TRMA is supposed to be a modest contribution to the problem of achieving appropriate balance between technology-push and market-pull, and to better managing - the pace, direction and extent of - innovation. Session 4 TALES FROM THE FRONTIER 80
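To illustrate how the 3D matrix of the TRMA (section III, Fig. 7), which relates market, product and process-technology scenarios over the planning horizon, can be handled in practice, the sketch below is offered. The scenario labels, time periods and compatibility judgements are invented for the example and are not taken from the paper.

```python
# Hypothetical scenario sets per perspective of the TRMA.
MARKET  = ["M-mass", "M-custom"]           # market need scenarios
PRODUCT = ["P-platform", "P-one-to-one"]   # product scenarios
PROCESS = ["T-24Hs", "T-HighTechRPD"]      # process technology scenarios (cf. the RPD map)

# Expert judgement per planning period: which (M, P, T) combinations fit together.
# True = the process scenario can deliver the product scenario for that market need.
FITS = {
    "2005-2010": {("M-mass", "P-platform", "T-24Hs"): True},
    "2010-2015": {("M-mass", "P-platform", "T-24Hs"): True,
                  ("M-custom", "P-one-to-one", "T-HighTechRPD"): False},  # T not mature yet
    "2015-2020": {("M-custom", "P-one-to-one", "T-HighTechRPD"): True},
}

def feasible(period):
    """Combinations judged workable in the given period."""
    return [combo for combo, ok in FITS.get(period, {}).items() if ok]

for period in FITS:
    print(period, "->", feasible(period))
```

Reading the feasible combinations period by period is one simple way of spelling out the alternative scenario sequences the text refers to, so that incumbents can compare where to go and how to get there.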

Linking S&T roadmapping and scenarios provides a framework that takes care of both the weaknesses and the strengths of each method. The power that maps and scenarios have to communicate is impressive. The approach lets managers imagine themselves on a trip, driving a company through the S&T landscape ahead. As in travelling, however, managers lack the skills and the budget to drive all the paths or to visit all the futures. Indeed, many scenarios will never happen and many paths will never exist. But developing plans includes acquiring information about the terrain ahead, as well as about the contingencies one may encounter. By exploring the terrain ahead, managers acquire the memory of the future required to deal with it.


Presentation 4 :

[Slides: "Roadmapping & Scenarios: A Conceptual View", Lizaso and Reger, University of Brandenburg, Germany (EU-US 04). Slide content (methodological outline: roadmapping preparation, system analysis, scenario projection, scenario building, assessment, road-mapping; RPD scenarios in the auto industry; 3D scenario matrix and scenario/path selection) is graphical and not reproduced here.]

Paper 5 : Evaluation of Laboratory Directed Research and Development (LDRD) Investment Areas at Sandia

KEVIN W. BOYACK AND NABEEL RAHAL 20

I. INTRODUCTION

The Laboratory Directed Research and Development (LDRD) program at Sandia National Laboratories conducts world-class research on a variety of subjects that are relevant to Sandia's missions and potentially useful to other national needs. Much of the technology that has been developed at Sandia has its roots in the LDRD program. Research investment decisions made ten and fifteen years ago are having a direct impact on national security programs today. Sandia's LDRD program is divided into roughly a dozen different investment areas (IAs), including five that we focus on in this paper: Computational and Information Sciences, Engineering Sciences, Electronics and Photonics, Materials Science and Technology, and Pulsed Power Sciences.

The LDRD process occurs annually at Sandia. First, staff members submit short ideas answering written calls (i.e. requests for proposals). Then, internal teams of experts in the technologies comprising each investment area review the short ideas and select some fraction of them for full proposals. The expert teams then review the proposals and select those to receive funding. LDRD projects have a maximum duration of three years. Continuation proposals and an annual review are required for each existing project that has not completed its term.

In an effort to improve the inputs to the decision-making process, and given the availability of relevant data, we have embarked on a program to map our LDRD investment areas. We apply advanced information visualization tools to understand the historical development, validate strategic and tactical directions, and identify opportunities for future development for each of the five IAs mentioned above. This paper describes the project plan, detailed processes, data sources, tool sets, and sample analyses and validation activities associated with the mapping of Sandia's LDRD investment areas.

II. PROJECT PLAN

The original plan associated with this assessment activity consisted of several steps, which included ways to both benchmark our methods and deliver practical results. The first step was to create Sandia-specific visualizations of the IAs. The purpose of these visualizations was to identify past and present technological competencies, and overlaps of competencies, within the IAs. Benchmarking was accomplished by comparing the visualizations with the mental models of the IA leads, experts who have, in the past, used traditional processes to understand their areas and make funding decisions.

20 Sandia National Laboratories, P.O. Box 5800, Albuquerque, NM. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy under Contract DE-AC04-94AL.

Time was built into the plan to iterate the visualizations if large differences were found between them and the leaders' mental models of their areas. Meetings with the IA leads were designed not only to benchmark the visualizations, but also to educate the leaders and add detail to their mental models.

After completion of the benchmarking activity with the Sandia-specific visualizations, a second set of visualizations was created to include data on all federally funded R&D activities related to the IAs. The purpose of this set of visualizations, hereafter referred to as DOE LDRD, was to place Sandia's IA activities within a broader context, thus allowing for the identification of new opportunities by association with activities outside Sandia. These visualizations were also presented to IA leaders. Copies of the data, visuals, and navigation tools were also provided to IA leaders to allow them to explore the data independently.

III. PROCESS, DATA & TOOLS

Two different types of visualizations, each designed to provide different types of information, were created for this activity. The first can be described as a landscape map, which is particularly adept at revealing patterns and trends in large datasets. The second type is a link analysis map, which is valuable for identifying relationships within large datasets.

The landscape maps were created using a process consistent with commonly accepted methods of mapping knowledge domains [1] (see Fig. 1): Appropriate textual records were identified and combined in a database. Latent semantic analysis (LSA) [2, 3] was used on the titles and descriptive text for each record to generate a document-document similarity matrix. A graph layout program, VxOrd [4], was used to calculate the document graph. The resulting graph or map was explored using VxInsight [5], a visualization tool that enables interactive navigation and query of an abstract information space.

Link analysis maps were generated using ClearResearch, a product developed by ClearForest 21 that extracts entities (e.g. person, company, technology, product, university) and relationships from unstructured textual sources. Using rules to define categories, ClearResearch produces link analyses at multiple levels of detail. The steps involved in producing these maps are as follows: The same textual records and database described for the landscape maps above were used here as well. A rule-based unstructured text tagging module was used on titles and abstracts to extract and categorize technology terms and organization terms (e.g. CIS, MST, PP, ES, EP). Technology and organization terms were linked together on a document basis and visualized in a network or link analysis map.

21 ClearForest.
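As an illustration of the link-analysis construction just described (not the ClearResearch implementation itself, whose tagging rulebook and internals are proprietary), the per-document linking of tagged technology and organization terms can be sketched as a simple co-occurrence count; the field names and example tags below are hypothetical.

```python
# Illustrative sketch, not the ClearResearch/ClearForest implementation:
# technology and organization (investment area) tags extracted per document are
# linked on a document basis; link strength is the number of documents in which
# a technology co-occurs with an IA tag. Field names below are hypothetical.
from collections import Counter
from itertools import product

def build_link_map(tagged_docs):
    edge_counts = Counter()
    for doc in tagged_docs:
        for tech, area in product(doc["technologies"], doc["areas"]):
            edge_counts[(tech, area)] += 1
    return edge_counts  # heavier counts correspond to darker lines in the map

# Small example: a technology shared by two IAs appears as two weighted edges.
docs = [
    {"technologies": {"MEMS"}, "areas": {"EP"}},
    {"technologies": {"MEMS", "lithography"}, "areas": {"MST"}},
    {"technologies": {"MEMS"}, "areas": {"EP", "MST"}},
]
for (tech, area), weight in sorted(build_link_map(docs).items()):
    print(f"{tech} -- {area}: {weight}")
```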

Both types of visualizations, the landscapes and the link analyses, were used for both the Sandia-specific and DOE LDRD analyses, as detailed below.

1. Data collection

Two different sets of data were compiled from multiple sources for our analyses: one for the Sandia-specific visualizations, and one for the DOE LDRD visualizations. The data for the Sandia-specific visualizations consisted of 1209 records from the five IAs and included the following: LDRD call text (i.e. requests for proposals), descriptive text for both new proposals and continuation proposals, report abstracts for funded projects, and abstracts from peer-reviewed publications resulting from funded projects from FY through FY2004. Given that we are in the midst of FY2004, no project reports are available for the current year. These data are proprietary to Sandia, and are not available externally.

To create the DOE LDRD visualizations, an additional ~4300 LDRD records from the entire US Department of Energy complex (including, for example, Los Alamos National Labs and Lawrence Livermore National Labs) for FY2001 and FY2002 were added. Of these, 180 duplicated existing Sandia-specific records and another 200 had no titles or descriptive text, and were thus removed from the data set. 990 of the new records had both titles and descriptive text, while the balance only had titles. With the Sandia-specific and additional DOE data, this set consisted of 5112 records.

22 FY = fiscal year, which runs from October 1 through September 30.

2. Similarity calculation

Latent semantic analysis is a technique based on the vector space model that has found recent application in information retrieval. It is a technique that can represent aspects of the meanings of words, and it deals effectively with synonymy and polysemy. Traditional LSA uses the singular value decomposition (SVD) technique to deconstruct a term-document matrix into the product of three other matrices, with {X} = {W} {S} {P}', where {S} is the matrix containing the singular values. This matrix is then truncated to the highest ~300 singular values. To calculate the document-document similarity matrix, matrices {W} and {S} are multiplied. The resulting vectors are then normalized to unit length, and the inner products are calculated. These inner products are the document-document similarity values.

Our LSA methodology differs slightly from that above in that we use semidiscrete decomposition (SDD) [6, 7] rather than SVD to do the term-document matrix deconstruction. Although this typically reduces the precision by a small amount (~2%), it is much less memory intensive due to its discrete nature, and it runs easily on a PC. We also used an optimized stopword list prior to construction of the initial term-document matrix. This stopword list was designed to allow the LSA to focus on technical content, and thus removed common words, many verbs, adjectives, and adverbs, and words that were tied closely to only one type of document (e.g. project, proposal, report).

LSA generates a full n x n similarity matrix. Our experience with many data types and sets indicates that use of the full similarity matrix is not necessary. Rather, use of the top few similarities per record is sufficient to characterize the map. Thus, we used only the top 15 similarities per record to generate the landscape maps.

3. Ordination

Ordination using the similarity files generated from LSA was done using VxOrd, a force-directed graph layout algorithm that preserves both global and local structure for a range of graph sizes (1k to 1M nodes). VxOrd has been used for many different types of maps with good success [5, 8-10]. This step is referred to as ordination rather than clustering because VxOrd generates x,y coordinates for each record (calls, proposals, reports, etc.), but does not assign cluster numbers. The ordination places similar documents close to each other on the graph.

4. Visualization using VxInsight

After calculating coordinates, a data set is loaded into VxInsight for exploration and analysis. VxInsight is a tool that allows visualization and navigation of an abstract information space, such as a large document set. It uses a landscape metaphor and portrays the structure of the space as peaks and ridges of documents. The size of a peak and its relative position in the layout provide valuable clues to the role of that group of documents in the overall structure. Labels on dominant peaks are based on the two most common words in the titles (or other fields) of the documents that comprise that peak, thus revealing the content of the various peaks. Users can navigate the map terrain by zooming in and out, by querying metadata fields (e.g., titles, abstracts), or by restricting the data displayed to a certain time span and sliding through sequences of years with a slider. Relationships among the data may be displayed as arrows between documents and understood at many levels of detail. Detail about any data record is also available upon demand.
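For readers who want to reproduce the flavour of the similarity step described in this section, the following is a minimal sketch. It is not the authors' tooling: it uses a plain truncated SVD (rather than the SDD variant described above), a generic English stopword list instead of the optimized one, and placeholder parameter values.

```python
# Minimal sketch of the LSA-style document-document similarity step described
# above. It is illustrative only: it uses a plain truncated SVD (scikit-learn)
# rather than the semidiscrete decomposition (SDD) the authors employ, a
# generic stopword list, and placeholder parameters.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import TruncatedSVD

def top_k_similarities(docs, n_components=300, top_k=15):
    vec = CountVectorizer(stop_words="english")      # basic stopword filtering
    X = vec.fit_transform(docs)                      # documents x terms
    svd = TruncatedSVD(n_components=min(n_components, X.shape[1] - 1))
    doc_vecs = svd.fit_transform(X)                  # documents x latent dims
    norms = np.linalg.norm(doc_vecs, axis=1, keepdims=True)
    doc_vecs = doc_vecs / np.maximum(norms, 1e-12)   # unit length vectors
    sims = doc_vecs @ doc_vecs.T                     # inner products = cosines
    np.fill_diagonal(sims, 0.0)
    edges = []                                       # keep only top-k per record
    for i, row in enumerate(sims):
        for j in np.argsort(row)[::-1][:top_k]:
            edges.append((i, int(j), float(row[j])))
    return edges  # edge list suitable as input to a force-directed layout

edges = top_k_similarities(
    ["laser doppler blood flow probe",
     "optoelectronic chip probe for blood flowmetry",
     "semidiscrete matrix decomposition for latent semantic indexing"],
    n_components=2, top_k=2)
print(edges)
```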

IV. ANALYSIS AND DISCUSSION

1. Landscape mapping of IAs

As mentioned above, one of the purposes of generating a Sandia-specific map of IAs was to benchmark the map against the mental models of the IA leads. One such exercise is described here for the Computational and Information Sciences (CIS) investment area. During our meeting with the CIS area leader, we first gathered information about the lead's mental model of 1) the CIS area, and 2) the perceived overlaps between CIS and the other four areas mapped, and then we presented our maps to the individual.

A graphic describing the CIS IA lead's mental model of the overlap between CIS and the other four areas is shown in Figure 2. First, the lead perceived that there was a significant fraction of the CIS space that was unique to the investment area, and thus had no overlaps. The two largest perceived overlaps were between CIS and the Engineering Sciences (ES) and Materials Science and Technology (MST) areas, each accounting for a significant fraction of the space. The perceived overlaps with the Electronics and Photonics (EP) and Pulsed Power (PP) areas were much smaller, but were thought to be increasing with time. Potential three-way overlaps were not considered.

The Sandia-specific IA map, generated using the process described above, is shown in Figure 3, and uses the same color scheme for the different IAs as that shown in Figure 2. The overall layout and distribution of the five IAs is shown in the middle left pane (labeled F) of Figure 3. Here, the main area for CIS is the major peak toward the upper left. ES tends to form a bridge between CIS and MST. MST is divided into two main components, one near the middle of the map, and one near the lower right. EP likewise has two main components, one near the upper right, and one at the middle right, while PP is focused at the far middle right. MST seems to have the most central position of the five IAs.

Detailed overlaps between the IAs are shown in the five blown-up regions of the map, with particular attention to those areas overlapping with CIS. These are marked with two-colored circles to highlight the overlap. Areas occupied solely by CIS are found in pane A (solid white circles), and correspond well to the mental model's view of unique space. Two significant overlaps between CIS and ES are shown in panes A and E, mainly dealing with algorithms and transport phenomena. Several smaller regions of overlap between CIS and MST are shown in panes A, B, D, and E, all having to do with microsystems and related technologies. These two areas, ES and MST, show the greatest overlaps with CIS, which is directly in line with the mental model of Figure 2. EP shows two small areas of overlap with CIS, in panes C and E, while PP shows only one overlap, in pane C. This also matches the mental model quite well.

One area of particular interest on the map is that found in the center of pane E, where four of the IAs overlap (all but PP). This suggests that all four IAs could share and benefit from joint calls and proposal review in the central subject of this region of the map, microsystems and related materials.

Using the VxInsight time-sliding capability, we investigated trends in the IA overlaps, some examples of which are mentioned here. The extent of overlap between CIS and ES has remained roughly constant over the period from 2001 to 2004, with the areas of focus shifting towards optimization of algorithms. The conceptual overlap between EP and MST has increased significantly in the past two years, especially in the area of integration for product application. Shifts in focus in the individual IAs can be seen as well. For instance, a portion of the EP portfolio dealing with MEMS has shifted from component integration to applications. Another significant outcome of the meeting with the IA leader was his desire to put the VxInsight tool and data sets on his PC so that he could explore the data independently and draw his own conclusions related to both assessment and potential future directions.

2. Link analysis of IAs

The analyses of the visualizations in the previous section strongly convey the patterns and trends occurring within the IAs. However, specific information indicating the relationships between technology and IA, and the explicit nature of the relationships between the technologies, is still hidden. In order to extract the hidden relationships within the landscape visualization, many hours of exploring, including reading abstracts, would be required. An alternative approach to such tedious review is the development of a link analysis map coupled with an unstructured text tagging rulebook. The link analysis map was crucial in portraying to the IA leads the direct and indirect relationships that occurred between technologies within their IAs, as well as relationships that occurred between all five IAs. This analysis added value in that the IA leads obtained information that assisted them in the evaluation and redirection of their R&D activities.

The first level of analysis consisted of identifying relationships between technologies and multiple investment areas. The relationships exposed by this analysis were intended to reveal potential overlapping or complementary technology spaces that can be jointly leveraged in future LDRD calls. Figure 4 is an example of the link analysis visualizations that were created and shared with the IA leads. Common technologies that indirectly link two (or more) IAs appear between the IAs, showing direct links between a technology and the associated IAs. Darker lines indicate stronger relationships. Technologies that are unique to an IA are depicted by the collection of links that extend out from each IA label. These are not shown in the figure to focus attention on the overlaps. The actual visualizations reviewed by the leads were often more detailed, using lower linking thresholds.

The first level of analysis provided a macro-scale understanding of the overlaps as well as of the unique competencies and capabilities that each IA possessed. This understanding was then used as a validation model for the IA leads. Figure 4 indicates that each investment area has a robust set of unique technologies, indicated by the unlabeled lines extending out from the IA markers.

This unique set of technologies represents the development of a strong and innovative R&D portfolio. The figure also validates the roles assigned to each investment area. For example, MST ideally should support EP, PP, and ES, with very little support to CIS. The rationale behind this is that MST provides the expertise in materials for the development of devices in EP and PP; however, MST needs the simulation expertise that resides in ES to develop materials, and ES needs the hardware and software expertise in CIS to develop and apply simulations. Figure 4 is a visualization of the current relationships, which seems to be consistent with the ideal state mentioned above.

The second level of analysis consisted of the identification of specific relationships between investment areas. Figure 4 depicts a very strong relationship between EP and MST. The darker color of the links between EP and MST indicates a strong potential collaboration based upon MEMS (micro-electro-mechanical systems) and lithography. In addition, optical detection, communication, optoelectronics, and remote sensing should also be taken into consideration as potential areas of collaboration. As a result of the findings above, it was advised that EP and MST work together to identify a collaborative approach to future LDRD calls. In addition, it was advised that the investment area leads create LDRD calls in the future that have a funding pool for joint EP and MST proposals.

The third level of analysis consisted of a technology-to-technology relationship assessment within a single investment area. The assessment was used to assist the investment area leads in portfolio management activities. The visualization contained very detailed (and thus proprietary) information, and is not shown here. The result of the visualization pointed to specific technological efforts within an investment area that could be combined to create a larger effort that could in turn attract future funding outside of the LDRD program. In addition, the IA lead was able to identify, compare, and leverage objective technological strengths to attract new external customers.

3. Landscape mapping of DOE LDRD

A map of the DOE LDRD data set was created using the same technique described previously, and is shown in Figure 5. The purpose of this map was primarily to identify additional opportunities by comparison of Sandia IA data with work of national interest that is being funded at other DOE laboratories. The roughly 3800 records added to the Sandia IA data add significant context and content that provide fodder for new ideas. It is worth noting that the visualizations themselves do not generate new ideas. Rather, it is the analyst or IA lead interacting with the visualizations who formulates questions and new ideas based on the information and patterns seen there.

Figure 5 shows the overall landscape comprising investment in LDRD by all of the U.S. DOE's laboratories. Significant areas of the map are not covered at all by any of the Sandia IAs. These are not areas of interest to Sandia, since the map indicates that they are well outside our core competency areas, but they are obviously within the competencies of other laboratories. We are more interested in new opportunities in areas closely related to our own competencies. The lower pane of Figure 5 shows detail of the lower left region of the DOE LDRD map, in the area dominated by Sandia's CIS investment area. All of the non-Sandia records have been marked as red dots. Examination shows several small clusters of data in areas that are closely related to our computational competencies (e.g. the areas marked by yellow circles: climate computing, nonlinear mathematics, computational biology), and that are potential areas of future opportunity for the CIS IA, given its current portfolio and competency base. Some of this was anticipated by the CIS investment team in that the FY2005 calls (issued in March 2004) reflected an increased interest in informatics, of which computational biology is one type.

4. Link analysis of DOE LDRD

The Sandia-specific link analysis assisted in the understanding of the technologies within, and the relationships between, the technologies from different IAs. The next step was to take the localized knowledge extracted from the investment area analysis and compare the strengths and weaknesses with the rest of the DOE complex. The first analysis in this section consisted of using only LDRD projects, in addition to rolling up all of the investment areas into an overall Sandia category. The second analysis consisted of analyzing each investment area in the context of the entire DOE complex. The data used for this analysis consisted of LDRD calls, proposals, and projects for the investment areas, and LDRD projects for the DOE complex.

The link analysis visualization for the entire DOE complex is represented in Figure 6. Although there are several labs in the original analysis, only the strongest links between technologies and labs were extracted and visualized. Figure 6 identifies the relationships between labs and technologies, and thus labs with common technology competencies. For example, Lab B has an area of common technical focus with Lab A through lithography, with Lab C through fuel cells and biological systems, and with Lab D through biological systems and semiconductors. The identification of these common points directs us to technology categories that can be further analyzed to identify the portfolio of technology that characterizes the capabilities of each lab. For example, when clicking on the fuel cells node in Figure 6, a large number of additional relationships appear. The relationships consist of additional labs and technologies that have weaker links than in the original visualization. Drilling down into a technology is a powerful analysis technique, and provides greater detail for the laboratory and IAs. The value of this analysis lies in its ability to identify the technological capabilities of each lab, in addition to determining whether duplication or collaborative opportunities exist.

The second analysis consisted of linking each individual IA to other laboratories in the DOE complex through common technologies.
The analysis was conducted by selecting each IA in turn and exposing all laboratory and technology relationships associated with it.

The result was a visualization that placed the IA in the middle of the link map, with a minimum of fifty nodes identifying direct and indirect relationships. The direct relationships were explored to identify duplication or complementary efforts. The indirect relationships were explored to identify complementary technology outside of Sandia, and thus to assist in the identification of new but related applications outside of Sandia's original intended use, or to suggest potential collaborative opportunities between laboratories.

V. FUTURE DIRECTIONS

This is the first year that we have applied such analyses to our LDRD process. Coming late in the annual process, the results have been more modest than they could have been. We plan to start a similar process for the FY06 LDRD cycle, and carry it out much sooner in the annual process. We have learned that it is important not to saturate the IA leads with the information from these analyses, but rather to present some information, and then allow them to further explore the information on their own. It is only as those with funding authority internalize the results of such analyses, integrate them into their mental models, and foresee how overlaps, collaborations, and new opportunities can benefit the return on investment to their IAs, that they will put the results into practice. We have also learned that one tool does not fit all, but that different approaches offer different perspectives and levels of detail that are all of benefit to the analyst or manager.

In the future, we also plan to investigate different models of impact and join the best of those to our visualizations in an attempt to provide further data to answer questions related to return on investment.

VI. REFERENCES

1. Börner, K., C. Chen, and K.W. Boyack, Visualizing knowledge domains. Annu. Rev. Inf. Sci. Technol.
2. Deerwester, S., et al., Indexing by latent semantic analysis. J. Am. Soc. Inf. Sci., (6).
3. Landauer, T.K., D. Laham, and M. Derr, From paragraph to graph. Proc. Natl. Acad. Sci. USA, (Suppl. 1).
4. Davidson, G.S., B.N. Wylie, and K.W. Boyack, Cluster stability and the use of noise in interpretation of clustering. In: 7th IEEE Symp. Inform. Visual. (InfoVis 2001), San Diego, CA.
5. Boyack, K.W., B.N. Wylie, and G.S. Davidson, Domain visualization using VxInsight for science and technology management. J. Am. Soc. Inf. Sci. Technol., (9).
6. Kolda, T.G. and D.P. O'Leary, A semidiscrete matrix decomposition for latent semantic indexing in information retrieval. ACM Transactions on Information Science, (4).
7. Dowling, J., Information retrieval using latent semantic indexing and a semi-discrete matrix decomposition. School of Computer Science and Software Engineering, Monash University, 2002.
8. Boyack, K.W. and K. Börner, Indicator-assisted evaluation and funding of research: Visualizing the influence of grants on the number and citation counts of research papers. J. Am. Soc. Inf. Sci. Technol., (5).
9. Kim, S.K., et al., A Gene Expression Map for Caenorhabditis elegans. Science.
10. Boyack, K.W., K. Mane, and K. Börner, Mapping Medline papers, genes, and proteins related to melanoma research. In: Information Visualisation (submitted), London.

Presentation 5 :

[Slides not available as text.]


Paper 6 : Dynamic monitoring of future developments

Maurits Butter (TNO-STB)
Jan Pieter Mook (Dutch Ministry of Economic Affairs)

Abstract

A crucial element of foresight studies is the gathering and analysis of future developments. Often this forms the foundation for the identification of strategies and their translation into actions. However, systematic gathering is time- and budget-consuming. Existing information sources are not suited to the specific questions asked and they are static in nature. New, intensive analysis of the existing material is needed, or even new fieldwork. The Dutch Ministry of Economic Affairs, in co-operation with TNO, developed a new foresight approach, called Dynamo. In this approach issues and innovations, representing the demand and supply sides of our society, are collected systematically and stored in a relational database: the Dynamo database. The collected data form the basis for interactive discussions with stakeholders, initiating brokerage and inspiration for new business opportunities and policy. The collection of concrete information elements builds up the core of the system, but the objective of the expert system is to draw conclusions at a meso level. Using a fingerprinting method, the information is made dynamic and can be used as a core source for many projects. This fingerprinting uses internationally accepted classifications for industry and research, and a tailor-made classification for the consumer and social side. In this way, industrial sectors, research areas and consumer/social demands are linked to specific innovations and issues. This enables four types of results:

1. Straight outputs, based on research and business.
2. Generation of themes, or clusters of innovations.
3. Overviews of innovations relevant for individual users.
4. Cross-correlation between research and business on possible areas for co-operation.

The database uses different sources as input. A first category is the systematic analysis of foresight studies (secondary validated sources). National and regional research programmes are also used as a source. Another, more expert-oriented source is formed by individual S&T watch activities, like NOST. At this moment, the database is a core element of five projects. More than 1200 topics have been gathered. One of the most important projects is the Dynamo 2004 exercise, where three Dutch public funding agencies (Senter, NWO and STW) are using their project portfolios to give input. Within the framework of this project, a public Dynamo Theme Day is held in April 2004 to discuss with business, research and government the opportunities of the Dynamo approach and the database. The paper will present more of the background of the Dynamo approach and the Dynamo database, including an elaboration of the concept of the system. Some results of the Dynamo 2004 exercise will also be described, including the results from the Dynamo Theme Day.

1. Why Dynamo?

1.1 Future studies and policy

"Future studies constitutes a systematic attempt to observe the long term future of science, technology, society, the economy and their mutual interactions in order to generate knowledge with which to effect social, economic and environmental improvements based on well founded projections." With this statement Mr. Ramon Marimon, Spanish Secretary of State for Science and Technology, opened his preface to the proceedings of the Foresight conference held in Spain in 2002 [Marimon, 2002]. It is clear that future studies can support the coherent development of research and innovation policies at national and EU level. They deliver economic analyses, technological indicators, and future arrangements and demands in the field of science and technology. But beside the information aspect, they can also inspire and facilitate the alignment of thoughts, investments and efforts of the stakeholders involved. In this way, a more effective and efficient policy can be developed to address the social issues we are facing today and tomorrow.

From the initial experiments and exercises for national science and technology policies in a few countries, the range of future studies has increased enormously, especially for countries in the periphery of S&T, for specific industrial and policy sectors, for regions, for organizations and for specific technologies. The number of possible tools for future studies has increased as well. Since the early years of future studies, the focus has shifted to other methods, like scenario studies, roadmapping and participative methods: methodologies to improve the processes of foresight and the creation of commitment to process and results. Another reason is that foresight increasingly gets linked to other tools for strategic policy making, such as evaluation and technology assessment, thus adopting their approaches [Kuhlman, 2001; Rip, 2003]. Traditional methods of forecasting are adopted more and more, even though the aims differ and results and their interpretation often serve other purposes [Cuhls, 2003]. As a result, policy makers are increasingly confronted with the question of how to design the appropriate foresight process to maximize its effect in terms of initiated action.

Experience shows that the actual effect of future studies is not optimal. A recent study of the UK Foresight programme shows that the Foresight projects in the UK focus on the foresight processes, and only some degree of systematic attention is given to the implementation of the results [Miles, 2002]. The Technology RADAR commissioned by the Dutch Ministry of Economic Affairs showed limited effect on implementation and innovation. Also, the process orientation initiated by foresight is losing some momentum in favour of a greater and better-defined need for content, although stakeholder involvement is still considered crucial.

These notions stimulated the Dutch Ministry of Economic Affairs to develop Dynamo. This new future-oriented approach puts the systematic and economical gathering of content information at the centre and gives priority to the use of results, both from the industrial and the governmental perspective. This paper describes the Dynamo approach and presents its first results.

1.2 Systematizing future studies in the Netherlands 23

Dutch science and technology policy has had experience with future studies for a long time. From the seventies onwards one can find successful and unsuccessful foresight exercises and programmes within Dutch science and technology policy. Most of the studies were done by ad hoc panels focussing on developments within a specific area, discipline or technology. In the mid-seventies the first attempts at foresight were initiated as part of the objective to link scientific research to social issues. Verkenningscommissies were installed to advise about research on education, science for policy and spatial planning. Although the reports invoked some discussion on the development and organisation of these fields of research, the effects were limited.

Box: Future studies in the Netherlands
Future studies are only one entrance to the history of Dutch science policy. An overall picture is beyond the scope of this article, but it should be stressed that the way future studies developed and are organized more or less reflects typical patterns of the Dutch research system. The Netherlands' research system has fewer top-down characteristics than France and the UK, or even Germany. There are a great many institutions, councils, and also external but-still-to-be-taken-into-account bodies, which contribute to agenda-setting, mediate between resource allocation and performance of research, and oversee parts of the research. All national research systems have, by now, an "intermediary level" between the "top", i.e. the State with its responsibilities for funding, structures of the system, and authoritative goals, and the "bottom", i.e. the research performing institutions. In the Netherlands, the intermediary level is crowded, and it appears to function well (even if a lot of time has to be spent on consultation and mutual accommodation) (Van der Meulen and Rip, 1994).

From these initial attempts three strands developed which are important for the development of foresight in the Netherlands and which still continue.
1. In the first strand, foresight became one of the tasks of sectoral advisory councils for research. A National Council for Agricultural Research already existed and new ones were modelled after it 24. The main reason for the success of these institutions is that they provide for interaction, mutual positioning and agenda building. In the Dutch political and scientific culture, this has given them a viable role, despite some criticism that reports are too general and not implementable. The COS 25 is now coordinating these activities.
2. The second strand of foresight studies developed within the context of technology policy. Initially, in the early eighties, technology foresight was implicit within the selection and preparation of innovation oriented programmes (IOPs). These national programmes aim to stimulate strategic research in promising technological areas.
3. A third strand of foresight developed mainly within the context of a science policy for the sciences, but appeared to be sensitive to changes in the policy relation between the government and the universities. At the end of the seventies science policy became more oriented to disciplines, and `verkenningscommissies' on chemistry, physics, biochemistry, biology and later on biophysics and mathematics were installed. These committees consisted mainly of academics and focussed in their reports on university research.
Although the Ministry pressured each committee to be selective in its claims and outward looking, the committees hardly set any priorities and concentrated on university research. These activities now take place under the supervision of the KNAW 26.

In the late eighties technology foresight became a separate activity, aiming at improvement of the technology policy of the government and at awareness of new technological developments among industry.

23 This paragraph is based on an article by Barend van der Meulen, member of the Dutch Foresight Steering Committee [Van der Meulen, 1996].
24 The National Council for Agricultural Research has existed since the mid-fifties as a 'participants organisation', i.e. an organisation set up jointly by government, researchers, and users of research, rather than a council established by the government. It has now been transformed into the Innovation Network Rural Areas and Agricultural Systems.
25 The Consultative Committee of Sector Councils for research and development.
26 The Royal Netherlands Academy of Arts and Sciences.

Within technology policy, foresight is linked to issues like `globalization', international technological competition and the support of technological innovation in SMEs. In the early nineties the integration of these future studies became an issue because of the objective to set national priorities and posteriorities for public S&T efforts. In 1992 a Foresight Steering Committee was installed by the then Minister of Education and Sciences to coordinate future studies and integrate results. Nevertheless, heterogeneity in foresight studies continues, as the information needs of the actors within the fields being foresighted are still more important than the need for integration of results.

Although more specific foresight studies are carried out regularly at various organisations, institutes and departments in the Netherlands, it was not until 1998 that the Technology Radar covered the whole technology spectrum. The themes resulting from the Radar have been studied and have resulted in policy measures. Today, future studies are carried out in a fragmented way. On a smaller scale the Foresight Steering Committee still carries out field-focused future studies, other ministries carry out their own specialized future studies (e.g. Weterings, 1997) and the activities of the Sector councils are still future-study oriented. Although many themes mentioned in the Radar are still topical and relevant today, the underlying information is not, and so, in 2002, a need for a new and systematic foresight system at the Dutch Ministry of Economic Affairs was expressed.

2. The Dynamo approach

2.1 Historical background of Dynamo

After the experience of the Technology RADAR and other international foresight programmes, the Dutch Ministry of Economic Affairs came to the conclusion that the effects of a large foresight programme were limited, and that much information was already (internationally) available. On the other hand, there was a need to anticipate future developments, both from an industrial and a governmental perspective. To address this problem, early in 2003 the Dynamo 27 approach was developed as part of a foresight process, in order to gain a better view of the relevant national and international innovative developments at a meso 28 or theme level.

Box: The Netherlands and innovation policy
The Dutch innovation policy agenda aims, amongst others, to actively support collaboration initiatives of universities, technological institutes and business to introduce innovations to the marketplace. These initiatives, usually a mixture of different scientific disciplines and industrial sectors, focus on technological themes.

Parallel to this, in 1999 TNO developed an innovation database on assignment from the Ministry of Economic Affairs, within the framework of the National Environmental Programme 4. The overall objective of this project was to identify the potential of technology to solve the persistent environmental problems, including the policy needed [Butter, 2000]. To manage the information overload experienced, the database was developed by linking technological systems to the persistent environmental problems. This was integrated with the information from a previous environmental foresight done by TNO in 1996 [Weterings, 1997]. The database proved to be a core element in the economical and dynamic monitoring and analysis of future developments. However, to ensure continuity, consistency and access over the long term, more than a one-year assignment was needed.
In 2003, TNO and the Ministry saw the opportunity for mutual benefit and created a joint venture to further develop the information management system into an expert system, also able to perform analyses. TNO concluded that further development was both scientifically and commercially interesting. Priority was given by TNO to further developing the Dynamo database into a strategic foresight tool 29.

27 Dynamic Monitoring.
28 Meso meaning at a branch or dynamic cluster level.

The Ministry of Economic Affairs decided that the joint venture could provide a multi-user platform, where economies of scale could ensure multi-stakeholder involvement and commitment of industry and research to the Dynamo approach.

2.2 The Dynamo concept

The overall aim of Dynamo is: to facilitate economic development by offering concrete information about possible present and future innovations and issues. Dynamo puts content at the heart of the process. This overall aim is divided into the following objectives:
To encourage and inspire industrial and research stakeholders into new economic activities, by showing present and possible future themes of research and development with an interactive approach.
To identify possible new areas of co-operation in business/business, business/research and research/research, as well as to broker between supply and demand in the field of innovation.
To facilitate these objectives by offering a platform for dissemination and networking using existing information.
To feed the policy process with future-oriented knowledge and innovation strategy, based on analysis of the gathered information (like research areas, issue analysis, stakeholder analysis, etc.).

[Figure: inputs (project portfolios, future studies, expert workshops) feeding into inspiration, brokerage and policy analysis.]

The Dynamo approach was developed with the following criteria in mind:
1. Multi-stakeholder and multi-need: whereas most systems target single users, the output of Dynamo has to accommodate different users with different needs.
2. Dynamic and flexible: rather than a paper report, which loses value over time, Dynamo should develop the content and information continuously. Targeting different stakeholders and, just as importantly, developments in policy, the system therefore has to be able to render the information flexibly in various formats.
3. Economical and added value: the process has to be economical, using information from other sources where it can, while providing added value.
4. Autonomous and appreciated: other stakeholders, the users of the system, should in return for use participate in keeping the information up to date. This will obviously also add to the economical aspect of the system. Naturally, if it is to be largely autonomous, Dynamo has to gain respect and appreciation.

2.3 The Dynamo process

The Dynamo approach combines interactive networking at the meso level with activities at the micro level. The basis of the approach is formed by the continuous gathering of information about important future developments that could be of interest to the Dutch economy. It collects data at the micro level and draws conclusions at the meso level. The information collected is used as a starting point for theme-oriented brokerage meetings and to initiate policy discussions.

29 Within TNO, the Dynamo database is/was used in 7 projects and a specialized team has been formed to ensure continuity and quality.

The Dynamo approach can be divided into three elements:
1. Data collection and input
2. Inspiration and brokerage
3. Policy analysis

[Figure: the three Dynamo phases (collection/input; brokerage, knowledge & inspiration; results & policy analysis), linking business, research, external studies, open input and external advice for various stakeholders.]

In the Collection phase, data are gathered from various available sources. A criterion is that the collected data are verifiable, so sources like newspapers and individual ideas are not included. The main types of sources are: Delphi surveys and other futures studies; expert workshops; and portfolio analyses of funding agencies, trade organisations, or RTOs (e.g. universities, TNO). The Internet is used to fine-tune the existing data and is not used as a source.

Box: Various data sources
The data gathered in the Dynamo database can originate from several sources, depending on the user. For example, the first data included were based on the analysis of 5 Delphi surveys and comprised more than 400 innovations, within the so-called Dynamo 2003 exercise. The second input was done within the Science Forward Look project and included some 250 issues, gathered through expert interviews and desk research. The Dynamo 2004 exercise used information from government-funded projects at NWO and Senter, and entered a total of roughly 500 innovations into the system.

As one of the core functionalities and objectives of the Dynamo approach is the dynamization of information, all data collected are centralized and made available in a flexible way. The concept is to gather and use data modularly for specific demands, and to add them to the existing data. The input made for specific demands is also made available for future demands. This makes usage in other projects possible, as well as the enrichment of the core data. For all data collected the sources are linked, enabling systematic activation for specific demands.

In the next phase, Clustering and Brokerage, the data collected are presented to a large group of stakeholders during a clustering & brokerage workshop. Prior to this workshop the database generates so-called themes, or innovative clusters. These themes consist of clusters of innovations from the collection phase which have a similar characterisation (this will be discussed later in chapter 3, The Dynamo database).

The participants are invited to judge and possibly comment on the data but, more importantly, they will discuss innovations, themes and trends amongst each other. This discussion is a goal in itself; it acts as a brokerage whereby stakeholders can be inspired and design innovative initiatives. Individuals have access to a subset of the data in an interactive way. They can get inspired by the presented data, but are also able to find the organisations and individual persons linked to specific data. This facilitates the individual identification of new business opportunities and offers a brokerage to initiate networking.

In the last phase, the policy implications of the previous activities and of the data collected are analysed for internal (Ministry) and external use. The collected data can be used as a starting point for in-depth analysis of innovation areas, like possible newly surfacing areas, possible new networks and other policy-relevant information. The functionality to transparently present government-funded projects is also an asset of the approach.

3. The Dynamo database

3.1 The data model

The core element of the Dynamo approach is the systematic collection and storage of data. To address the criteria of dynamization, flexibility and multi-user access, a relational database is used, called the Dynamo database. The general concept behind storing these data in the database is to look at innovation, classically, as a linear process. An innovation can be seen as a product or process successfully introduced in the marketplace, resulting from a new combination of developments in scientific areas. Using this scheme, a combination of scientific areas results in an innovation that is introduced into the marketplace through, possibly, different business sectors in order to fulfil various (consumer) demands/needs.

[Figure: linear innovation model, with scientific areas (area 1 to area 5, R&D) feeding an innovation, which is applied in the marketplace through business sectors (sector 1 to sector 5) to fulfil social needs (need 1 to need 5).]

The concept of the Dynamo database is that future studies focus on possible technological, organisational, institutional and cultural changes in our society and include the context in which these changes take place. So, future studies are not limited to the changes themselves, but also cover the related factors and actors. This is made operational in the database by focusing on changes and characterizing these changes using contextual dimensions.

The core element of the database is formed by a set of innovations and issues (called topics), which stand for the demand and supply sides of our society. As the aim of Dynamo is brokerage and analysis at a meso level, the character of the topics is multi-actor at micro level. The topics can be seen as the linking pin between business, research and other societal organisations, and are the focal points towards action (both the problems to be solved and the solutions).

[Figure: topics (innovations and issues) at the centre of the three dimensions Industry, Research and Society.]

These issues and innovations are fingerprinted using predefined classifications. The philosophy is that these classifications create a profile of the issues and innovations.
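As an illustration of how such a fingerprinting data model might be laid out in a relational database, the sketch below links topics to classification codes on the three dimensions described next (industry, research, society). All table, column and query names are hypothetical assumptions; the actual Dynamo schema is not published in this paper.

```python
# Hypothetical sketch of a fingerprinting data model along the lines described
# here: topics (innovations and issues) are linked to classification codes on
# the industry (NACE), research (ASRC) and society dimensions. Table, column
# and query names are illustrative assumptions, not the actual Dynamo schema.
import sqlite3

schema = """
CREATE TABLE topic (
    topic_id     INTEGER PRIMARY KEY,
    kind         TEXT CHECK (kind IN ('innovation', 'issue')),
    title        TEXT,
    description  TEXT,
    source       TEXT   -- Delphi study, project portfolio, expert workshop...
);
CREATE TABLE dimension_code (
    code_id    INTEGER PRIMARY KEY,
    dimension  TEXT CHECK (dimension IN ('industry', 'research', 'society')),
    code       TEXT,    -- e.g. a three-digit NACE or ASRC code
    label      TEXT
);
-- The fingerprint: one row per link between a topic and a classification code.
CREATE TABLE fingerprint (
    topic_id  INTEGER REFERENCES topic(topic_id),
    code_id   INTEGER REFERENCES dimension_code(code_id),
    PRIMARY KEY (topic_id, code_id)
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(schema)

# A meso-level query in the spirit of the approach: which research areas and
# industry sectors share topics, i.e. candidate areas for co-operation.
cooperation_query = """
SELECT r.label AS research_area, i.label AS industry_sector, COUNT(*) AS n_topics
FROM fingerprint fr
JOIN dimension_code r ON r.code_id = fr.code_id AND r.dimension = 'research'
JOIN fingerprint fi   ON fi.topic_id = fr.topic_id
JOIN dimension_code i ON i.code_id = fi.code_id AND i.dimension = 'industry'
GROUP BY r.label, i.label
ORDER BY n_topics DESC;
"""
for row in conn.execute(cooperation_query):
    print(row)
```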

A dimension can be defined as: a structure to classify the context of a topic, from the perspective of technical, organizational, institutional or cultural improvements, in a systematic way. Based on the three basic strands of the triple helix concept (university-industry-state), the fingerprinting uses three core dimensions [Leydesdorff, 2001]:

Industry
This dimension represents the industrial side of our economy. To classify business, the NACE code is used, which is the EU-accepted industrial classification of industry [see RAMON]. This classification is included at the three-digit level.

Research
The research dimension represents the public research infrastructure, including universities and semi-public RTOs. To classify the research infrastructure, the Australian Standard Research Classification (ASRC) is used, which is based on the Frascati manual [OECD, 1998; ABS, 1998]. This classification is included at the three-digit level.

Society
The third side of societal change is formed by the social needs of citizens. These represent the changing markets and consumer needs. To classify this dimension, a combination of Maslow's hierarchy of human needs [Maslow, 1943], a translation to economic needs [Weterings, 1998] and governmental domains is used. [30]

Three interacting dynamics (knowledge production, markets (i.e. diffusion) and control) can be expected to generate innovation. By identifying the relevant parts of these institutional entities, the relevance of a topic for industrial activities, universities, consumer needs and governmental action can be identified.

An integrated optoelectronic chip probe for direct blood flowmetry (example of innovation)
Laser Doppler blood flowmetry is a non-invasive optical technique to measure the blood flow in skin and other human tissue. The spectrum of certain fluctuations yields information about the concentration and velocity of moving red blood cells. The innovation will allow discrimination between flow in shallow and deeper tissue layers, or between flow in small and larger blood vessels. From a medical point of view, such discriminations are highly desirable. The innovation is an integrated laser Doppler probe, meaning that one silicon wafer will contain one or two lasers, a detector and electronics. This probe will be much smaller than the present ones and will have a higher spatial resolution. The absence of fibres will largely eliminate the disturbing effects of patient motion.

The innovations are defined as: new applications of knowledge and/or technology in a potentially commercial way [OECD, 1996]. These innovations are systemic in nature (no "nuts and bolts") and need, or needed, substantial investments (financial or HRM) from several organisations. They must be usable in a broad market. Next to a title and a short description, specific characteristics are included in the database, such as:
- Time horizon for market breakthrough
- Level of innovation (e.g. incremental, radical, system change)
- Type of innovation (market, technical, organisational, cultural)
- Phase of development (fundamental research, development, fabrication, communication, adoption)

Diversification of farming: new species: llamas, ostriches, deer (example of issue)
Johne's disease in camelids; a recognized need for the establishment of protocols for the study of certain diseases in camelids; valid diagnostics are required by world trade agreements to ensure rapid and safe movement of livestock.
With regard to llamas and alpacas, bovine brucellosis, vesicular stomatitis (VS), bluetongue (BT) and epizootic hemorrhagic disease (EHD) are priority concerns; the conditions under which ostriches are raised are important for health and welfare; there is a danger in extrapolating from other species.

These innovations can also include organisational changes, new markets and other innovative changes. The issues are defined as: societal problems that must be addressed at a programme level. The issues are also systemic in nature (not project-related knowledge questions or project objectives).

[30] This dimension is still under construction.

The issues also need more than one organisation to solve them, including research, business, government and other societal stakeholders. For each issue, a title and a short description are also included in the database, and the same characteristics are profiled. The phase of development is adjusted to more issue-related categories (recognition, agenda setting, policy development, policy implementation, monitoring).

3.2 Input and gate keeping
The most important element of the database is the continuity, consistency and sustainability of the data. TNO is responsible for the structure and content of the database, but it is impossible for a single organisation to enter all topics, including their fingerprinting. The modus operandi is that other organisations (third parties) offer suggestions for new topics and TNO gate-keeps the data. The data collection is divided into the following stages:
1. Input by a third party
2. Gate keeping by TNO
3. Acceptance or rejection
4. Approval by the Dynamo Steering Committee
The data included are evaluated periodically. In this way, the quality of all information is ensured. The evolution of future innovations is also monitored, enabling historical analysis. TNO has set up a Dynamo Gatekeeper Team that evaluates all entered information. At this moment the team has about five members, so medium-term consistency is ensured. Internal procedures and manuals for gate keeping and data entry have also been developed to enforce consistency.

3.3 Four types of results
As the information is gathered in a dynamic way, the output generated can be diverse. The simplest output is transposing the database: instead of looking at the data from the perspective of the included topics and presenting the related classes, a specific class can be selected and the corresponding topics presented. This enables answers to questions like: What kinds of future innovations are relevant for the food industry in general, or for "Production, processing and preserving of meat and meat products" in detail? What kinds of issues are in need of innovation from the viewpoint of Industrial Biotechnology and Food Sciences? Which social needs are innovation-related, and what kinds of innovations can help solve environmental problems? All output can be fine-tuned using the detailed characteristics of the topics (timeframe, type, etc.). This enables short-, medium- and long-term views, as well as other detailed analyses.

The topics included in the database can also be used to link classes in the dimensions. For example, the innovation "Biocompatible robotics for revalidation" links the business sector "Manufacture of electric motors, generators and transformers" to the research area "Biomedical Engineering". Because of the number of topics, statistically important relations between classes can be identified:
- Relating classes internal to a specific dimension, e.g. business to business.
- Relating classes in different dimensions, e.g. business to research.
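The class-to-class linking described above can be sketched as a simple co-occurrence count over the fingerprints: the more topics two classes share, the stronger the indication of a possible relation or area of co-operation. In the sketch below, the first topic and its two classes follow the paper's own example; the second topic's class labels are hypothetical additions for illustration only.

```python
from collections import Counter

# Illustrative fingerprints: topic title -> set of (dimension, class label) pairs.
# The first entry follows the paper's example; the hypothetical labels are marked.
fingerprints = {
    "Biocompatible robotics for revalidation": {
        ("business", "Manufacture of electric motors, generators and transformers"),
        ("research", "Biomedical Engineering"),
    },
    "Integrated optoelectronic chip probe for blood flowmetry": {
        ("business", "Manufacture of medical instruments"),   # hypothetical class label
        ("research", "Biomedical Engineering"),
        ("research", "Optical Physics"),                       # hypothetical class label
    },
}

# Count how many topics link each (business class, research class) pair.
links = Counter()
for classes in fingerprints.values():
    business = [c for d, c in classes if d == "business"]
    research = [c for d, c in classes if d == "research"]
    for b in business:
        for r in research:
            links[(b, r)] += 1

# Pairs shared by many topics point to statistically important relations.
for (b, r), n in links.most_common():
    print(f"{n} topic(s) link '{b}' to '{r}'")
```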

Besides these possible relations between classes, an initial agenda can also be set up: the topics common to both classes are possible areas of co-operation. Within the topics, a distinction between innovations and issues is made. This enables a cross-correlation between the demand and supply side of our society. The system provides, for example, an answer to the question: what kinds of innovations are related to the issue of obesity? An overview can also be given of which kinds of business and research organisations are relevant. This is a very powerful functionality, because specific clients can use the core data to search for actions to solve their specific issues, including an overview of possibly important stakeholders.

The ability to make a cross-reference between classes within a dimension enables a valuable additional output. A project-specific dimension can be added, for example the organisational structure of a firm (its units). If all relevant topics are then fingerprinted by this dimension, a cross-correlation can be generated that shows the linkages between the organisational units. In this way, two interesting outputs can be generated:
1. Initiation of innovation- or issue-oriented reorganisation based on content.
2. Identification of possible areas of co-operation between the organisational units.

At this moment, more than 1500 topics are included in the database. The objective is to increase this number five- to ten-fold. With such a large number of topics, strategic analysis cannot be made directly because of information overload; a clustering of topics is needed. Because of the fingerprinting, it is possible to correlate topics. The database can be asked to find innovations that are related to a specific innovation via related business and research classes and social needs (themes). This clustering method is based on statistical analysis, using a similarity index as the means of correlation.

[Figure #: Schematic representation of correlation: the primary topic and the secondary topic share a classes, while b classes are unique to the primary topic and c classes are unique to the secondary topic.]

For each dimension, the similarity index of a primary innovation (or topic in general) to a secondary innovation (or topic in general) is calculated using the Sorensen-Dice method:

    SI = 2a / ((a + b) + (a + c))

where:
SI: similarity index
a: the number of classes present in both the primary and the secondary topic (overlap)
b: the number of classes unique to the primary topic
c: the number of classes unique to the secondary topic

To calculate the overall similarity index across all dimensions, the geometric mean of the dimension-specific indexes is taken:

    SI_total = (SI_research * SI_business * SI_society)^(1/3)

where:
SI_total: the general similarity index
SI_research: the similarity index for the Research dimension
SI_business: the similarity index for the Business dimension
SI_society: the similarity index for the Social needs dimension
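A minimal Python sketch of this calculation, with each topic's fingerprint represented as a set of class codes per dimension, could look as follows. Only the formulas follow the paper; the example fingerprints and class codes are hypothetical.

```python
from math import prod

def sorensen_dice(primary: set, secondary: set) -> float:
    """Per-dimension Sorensen-Dice index: SI = 2a / ((a + b) + (a + c))."""
    a = len(primary & secondary)                 # classes shared by both topics
    total = len(primary) + len(secondary)        # (a + b) + (a + c)
    return 2 * a / total if total else 0.0

def overall_similarity(primary: dict, secondary: dict,
                       dimensions=("research", "business", "society")) -> float:
    """SI_total: geometric mean of the dimension-specific indexes."""
    indexes = [sorensen_dice(primary.get(d, set()), secondary.get(d, set()))
               for d in dimensions]
    return prod(indexes) ** (1 / len(indexes))

# Hypothetical fingerprints (class codes are illustrative, not real NACE/ASRC codes).
topic_a = {"research": {"291", "321"}, "business": {"244"},        "society": {"health"}}
topic_b = {"research": {"321"},        "business": {"244", "247"}, "society": {"health"}}

print(overall_similarity(topic_a, topic_b))      # ~0.76 for these example fingerprints
```

The same index can also be applied to a user's search profile, fingerprinted with the same dimensions, to rank topics for the brokerage type of use described below.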

In this way, clusters of topics are identified that have (more or less) similar business, research and societal characteristics. This means that clusters are identified in which research and business can co-operate to develop new markets.

The clustering mechanism can also be used for individual stakeholder searches. A user profile can be set up, fingerprinting the stakeholder's characteristics using the same dimensions (business, research, societal needs). This profile can then be used to find similar topics. Next to possibly important topics, important types of organisations can be identified. This functionality enables a brokerage type of use: organisations or individual users can search the database for new markets and possible innovations that may inspire them to new business opportunities. Next to this inspiring element, direct linkages with the organisations involved are possible; networks, or even partnerships, can be initiated.

3.4 Dynamo in practice: Internet and MS-Access

Access to Dynamo 2004: The Dynamo 2004 project ended in April 2004, with a successful presentation during the Dynamo Theme Day conference. The public data can be accessed using: Username: tnoguest; Password: guest. The content and structure are still under construction. The number of innovations included will continue to increase over the coming months, due to additional acceptance. The website interface will also be developed further based on comments made during the Theme Day, such as presenting information about the organisations involved.

Dynamo uses the Internet as a communication network for data entry, gate keeping and public output. A dedicated Internet server provides the technological platform. The primary functions of the Internet interface are enabling third-party input and providing public output. With a project-dedicated login username, data can be entered into the database (input account); the user is able to enter and adjust data. Next to this type of account, a project-dedicated login username for public output is available. This account enables straightforward outputs, limited cross-correlation and limited individual search profiles. All information given is online data.

Next to this public website, a Gatekeeper website has been developed. This provides online access to the data and to project management, as well as specific functionalities to evaluate the data entered by third parties, such as sending changes to the third parties based on history logs, cross-correlation of the data entered, facilitation of Internet evaluation, and user management. Access to this website is limited to the Gatekeeper team.

An MS-Access application has been developed for specialized data analysis. The MS-Access application is available using a password token. It provides an electronic snapshot of the database, including selected data. The application can be used for in-depth analysis, making project-dedicated queries. This enables functionalities such as:
- Including the topics' characteristics in the analysis.

- Relating issues and innovations.
- Using a project-specific subset of the dimensions.
- Making overall cross-correlation tables.
- Introducing client-specific monitoring functionalities for business opportunities.
- Facilitating the generation of themes, including the selection of the dimensions used as the basis for correlation.
However, using this application requires in-depth knowledge of the database.

4. Results of Dynamo

4.1 Introduction to Dynamo 2004
Although the Dynamo approach is general, it is applied in a periodic cycle. In 2004 the Dynamo approach was tested in an experimental but representative environment with a limited set of stakeholders. This first public trial and presentation of the Dynamo concept is called the Dynamo 2004 project [31], and the ambition was to strive for a good representation of innovation in every research field and sector. Only innovations were gathered in this exercise. The objective of the trial project was to explore whether the approach would be valuable and viable.

The information gathered in the Dynamo 2004 exercise was limited. Two government funding agencies, NWO and Senter [32], used their publicly funded project portfolios as a source. Additional innovations based on projects and future studies by the Economic Affairs Office for Science & Technology and TNO were added. This represented a balance between innovation projects from science and from industry, respectively. The analysis of the information gathered was limited: based on expert judgement and an internal project meeting, themes were identified and elaborated upon. No in-depth analyses were made because of the limited representation of the sources.

On April 21st, 2004, representatives of science, research institutes, business, branch organisations, intermediaries and government were invited to examine the first public results of the Dynamo process and the database. Parallel sessions were organised to discuss various themes and improvements to the process and database.

4.2 Collection of information
During the period November 2003 to March 2004, within the framework of the Dynamo 2004 project, some 600 innovations were fed into the database and gate-kept. The input was based on the following sources:
- The Senter portfolio of funded projects, based on the programmes EET, TS and EDI.
- The NWO portfolio of accepted projects in several research programmes, based on the research councils (limited to 2002): Earth and Life Sciences; Chemical Sciences; Medical and Health Research; Physical Sciences; Physics; Technology [33].
- The TNO portfolio of co-financed industrial projects (limited to 50 innovations).
- The Technologies clés 2005 future study provided by the Economic Affairs Office for Science & Technology [CM International, 2000].

[31] Dynamo 2004 ran from August 2003 to April 2004.
[32] NWO, the Netherlands Organisation for Scientific Research, is the main subsidiser of scientific research. Senter, part of the Ministry of Economic Affairs, is the agency responsible for the execution of grant schemes on behalf of a range of Dutch ministries and is the main subsidiser of industrial innovation projects.
[33] Executed by the Technology Foundation STW.

Table #: Overview of innovations collected within the Dynamo 2004 project, by source (the full table also characterises each source by its dominant research fields, from natural sciences, medical sciences and engineering and technology, and by its industry sectors, such as manufacturing, food, electrical and electronic, transport and telecom).
- Senter: 248 innovations
- TNO: 50 innovations
- NWO: 165 innovations
- TWA: 70 innovations

The emphasis from the research perspective is on natural sciences and engineering and technology; one specific source (NWO) gives much attention to medical sciences. From the industrial perspective, manufacturing is central, and some sources give attention to transport and telecom. There is little attention to agriculture, construction and other non-technology industrial sectors. However, definitive conclusions cannot be drawn, because this is the result of a biased collection of information (specific programmes and limited literature).

4.3 Clustering in themes
Based on the sources described above, an identification of themes was performed. This clustering process used the following steps:
1. Step 1: Automatic generation of clusters. The correlation function was used as a starting point and generated 87 clusters (SI factor of more than 0.2).
2. Step 2: Draft theme identification. In the next step, expert judgement was used to sharpen the clusters into themes. The result was 40 draft themes and 8 residual clusters that did not have a clear focus.
3. Step 3: Sharpening of themes. In a workshop, the draft themes were discussed, missing themes were identified, as well as overlap. The residual clusters were also discussed. In the discussions, the essential product/market combinations played a central role.
4. Step 4: Development of the final themes. In the last step, the final 35 themes were described, based on the discussions in the workshop.
A sketch of the automatic cluster generation used in step 1 is given below; the 35 identified themes are listed in the table that follows.
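The paper does not specify how the 87 clusters of step 1 were generated beyond the SI > 0.2 threshold. A minimal sketch, assuming simple single-linkage grouping (topics whose pairwise similarity index exceeds the threshold end up in the same cluster), could look like this; the similarity function passed in would be the SI_total of section 3.3.

```python
from itertools import combinations

def cluster_topics(similarity, topics, threshold=0.2):
    """Single-linkage grouping: any pair with similarity > threshold is merged
    into the same cluster (a sketch of step 1, not the actual Dynamo routine)."""
    parent = {t: t for t in topics}              # union-find forest

    def find(t):
        while parent[t] != t:
            parent[t] = parent[parent[t]]        # path compression
            t = parent[t]
        return t

    for a, b in combinations(topics, 2):
        if similarity(a, b) > threshold:
            parent[find(a)] = find(b)            # merge the two clusters

    clusters = {}
    for t in topics:
        clusters.setdefault(find(t), []).append(t)
    return list(clusters.values())

# Toy usage: a made-up similarity function in place of the real SI_total.
topics = ["A", "B", "C", "D"]
toy_si = lambda x, y: 0.5 if {x, y} <= {"A", "B", "C"} else 0.0
print(cluster_topics(toy_si, topics))            # [['A', 'B', 'C'], ['D']]
```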

Table #: Overview of identified themes
Devices for computing and communication; Plastics and polymers; Environmental technologies and management; Coating and other surface technology; Medical diagnostics; Medical drugs and therapies; Enabling technologies for life sciences; Agricultural production and management; Logistics management; Infrastructural works; Packaging technology; Chemical conversion technology; Mining of natural resources; Metals and metal products; Industrial biological technology (white biotech); Indoor climate systems; ICT networks and infrastructures; Transport safety and efficient automobiles; Materials and construction testing; Tools and methods for designing products and constructions; Sensor technology; Software for computing and communication; Ship building and water transport; Public energy systems; Medical implants and transplantation technologies; Industrial energy systems; Food production technology; Food preservation, quality and safety; Micro- and nanoscale applications; Building methods and concepts; Building materials; ICT services; Industrial safety; Industrial manufacturing technology; Industrial separation technology.

It can be concluded that the data entered within the Dynamo 2004 project focused on these themes. It cannot be concluded that these are the main themes for the coming decades; however, they can be used, for example, to inspire new brokerage meetings.

4.4 The Dynamo Theme Day
The objectives of brokerage and theme identification were tested during the Dynamo Theme Day. Some 65 participants were present, representing industrial firms, RTOs, government, branch organisations and intermediary organisations. During the day the Dynamo database was presented and some themes were discussed. During the final sessions the added value of the Dynamo database and the underlying process was discussed. The conclusions of the Dynamo Theme Day are as follows:
- The added value of the Dynamo database was confirmed. Especially the brokerage element was regarded as highly useful to: 1) inspire new co-operation; 2) inspire new business opportunities; 3) make connections between organisations.
- The information in the database was considered to add value to the existing knowledge of experts. The broad overview was confirmed, although it was considered scattered and incomplete. More sources (e.g. patent data) should be incorporated.
- The themes rang a bell and were useful, but the content of the themes (the underlying innovations) was too fragmentary to give an overview of the major developments within a theme. However, the themes offered a useful starting point for discussions and were considered to add value.
- The basis for policy implications was considered too small.
Some suggestions were made for improvement of the Dynamo website. First, the interface could be made more intuitive, so that different users could find the information they need more quickly; a layman should be able to use the system, instead of the expert interface now available. Some remarks were also made about the character of the content, such as the need for more firm-oriented information. The approach of the system is still considered technology push: the information presented concerns technological innovations, whereas market developments and organisational innovations are also important. The demand-pull side should therefore also be developed. Although the comments on Dynamo were critical, many of the participants of the Dynamo Theme Day offered to assist in its further development.
The added value was considered high, although progress must still be made.

5. Conclusions

5.1 Collection
Parallel to the Dynamo 2004 exercise, the database is (and has been) used as a knowledge management system for more than half a dozen projects (see the table below and Annex B). At this moment almost 2000 topics are included in the database.

Table #: Overview of projects using Dynamo as a core expert system, including the data contributed (numbers are approximate).
- Dynamo 2003 (general): 400 innovations
- Science Forward Look (food, environment, rural areas): 250 issues
- Binco (biotech, ICT, nanotech, cognitive)
- Dynamo 2004 (general, with a focus on the Netherlands): 500 innovations
- RADAR 2004 (electric and building)
- Suspronet (product service systems): 200
- Manvis (manufacturing industry)
- Total: approx. 450 issues

The collection element of the Dynamo approach proved to be crucial. The Dynamo 2004 exercise shows that the information collected was considered inspirational, but the fragmented and fractional character of the set of innovations limited its use and added value. Although the Dynamo 2004 subset has been enlarged by the other projects, the total set is still too fragmented and not consistent enough to be used for systematic analysis. More topics are needed to support useful conclusions at the meso level.

The overall procedure, in which third parties make draft input and the gatekeeper team does the evaluation and editing, proved to be adequate. The input of third parties needs assistance, but is crucial to achieve economies of scale and commitment to the approach. The translation of science-oriented projects into innovations proved to be difficult: it needs business-oriented thinking and a broad overview of possible markets, which takes special experience and capacities. The conclusion is that this translation of science projects should be done by the gatekeeper team to make it more efficient. The more industrial projects can use a more direct procedure, and the innovations entered can be gate-kept directly. Although the quality of the gate-keeping process proved to be sufficient, it can be improved by relating new input to existing input and making cross-referential evaluations. These additional functionalities should be supported by the database.

5.2 Individual brokerage
The Dynamo Theme Day showed the enormous need for a system that can relate organisations through innovations. The fractional and fragmented character did not inhibit the use of the system, which proved useful even in this pioneering stage of the Dynamo approach. The essential elements were:
- The output of innovations instead of projects.
- The possibility to use individual search profiles to identify relevant innovations.
- The possibility to identify other organisations (industrial and research) working in similar fields.

This functionality is mainly based on the Dynamo Internet website and can be accessed individually. However, the participants concluded that although the functionality was there in principle, the actual interface was not sufficient to stimulate broad use. It has to be developed into a more intuitive interface.

5.3 Dynamo as a policy tool
One of the objectives of the Dynamo 2004 project was to provide a selection mechanism for identifying important policy areas to be addressed by further future studies and policy. The Dynamo Theme Day showed that this could not be facilitated by the Dynamo approach: because of the fragmented and fractional character of the information, a selection of areas cannot be justified. The innovations were too specific to cover all research and business fields. Although every entry was relevant, it was concluded, by e.g. branch organisations, that this did not represent the whole. To make the Dynamo approach more useful for policy, the data collected must be broadened. Nevertheless, some functionalities proved to be potentially useful:
- Inspiring the innovation discussion with new areas. This functionality must be seen as a first point of departure to initiate new business and policy.
- An innovation platform to continuously identify possible newly surfacing areas of innovation.
- Making project funding and the momentum of innovation transparent to the public.
- An analysis platform for issue-oriented innovation strategy development. Although this functionality has not been developed yet, it will be set up in the coming months.
The database does not yet allow detailed policy analysis, because of the limited basis of data. However, there is definitely potential, and it can already be used for inspirational purposes.

5.4 Final conclusions
The overall conclusion of the Dynamo 2004 project and other recent developments is that Dynamo is a valuable approach with much added value. Many stakeholders (government, business and research) appreciated the approach, although their needs were not always the same. This shows that the first criterion, multi-stakeholder and multi-need, was met. The second criterion, dynamic and flexible, was also recognized: the data collected showed added value and could be translated to other projects. Economies of scale are possible, but need the involvement of other parties to enhance the quality of the data collected and the outputs generated. At this moment more than five projects are using the system and joining forces, so the criterion of autonomous and appreciated can be considered met. However, further development is needed. The breadth of the data collected must be expanded, and other functionalities will enhance the economies of scale. An important element is the improvement of the public website interface. The conclusions are:
- Broadening the data collection nationally and internationally is necessary to improve the added value of the system.
- The inspiration and brokerage functionality proved to be very valuable and is already beyond the critical mass. However, the website interface must be further developed to enable layman use.
- The policy analysis functionality still has to be developed further. More data are needed to broaden its use.
The final conclusion is that the Dynamo approach has added value, but still needs significant improvement.

5.5 Dynamo in the future
The developments around the Dynamo approach are still ongoing. The Ministry of Economic Affairs, together with TNO, Senter and NWO, is discussing the further development. Internationalization of the project is also being looked at. Other projects also use the Dynamo database, so its character as an innovation platform is in place. New functionalities are also being considered, such as:
- Further development of the brokerage tool. How can this functionality be organized in a self-sustaining way?
- Linking driving forces and barriers to policy measures: Dynamo as a policy tool. Using the system to identify policy measures based on the issues and innovations needs a classification of driving forces and barriers.
- Impact assessment: economic impact, environmental impact and employer risk are examples of indicators that can be linked to innovations and add value to the system.
These new functionalities are being investigated further.


More information

SUCCESSION PLANNING. 10 Tips on Succession and Other Things I Wish I Knew When I Started to Practice Law. February 8, 2013

SUCCESSION PLANNING. 10 Tips on Succession and Other Things I Wish I Knew When I Started to Practice Law. February 8, 2013 SUCCESSION PLANNING 10 Tips on Succession and Other Things I Wish I Knew When I Started to Practice Law February 8, 2013 10 Tips on Succession Planning and Other Things I Wish I Knew When I Started to

More information

Public Sector Future Scenarios

Public Sector Future Scenarios Public Sector Future Scenarios Two main scenarios have been generated as a result of the scenario building exercise that took place in the context of the SONNETS project, as follows: Probable Scenario

More information

2018 NISO Calendar of Educational Events

2018 NISO Calendar of Educational Events 2018 NISO Calendar of Educational Events January January 10 - Webinar -- Annotation Practices and Tools in a Digital Environment Annotation tools can be of tremendous value to students and to scholars.

More information

Intellectual Property

Intellectual Property Intellectual Property Technology Transfer and Intellectual Property Principles in the Conduct of Biomedical Research Frank Grassler, J.D. VP For Technology Development Office for Technology Development

More information

MORE POWER TO THE ENERGY AND UTILITIES BUSINESS, FROM AI.

MORE POWER TO THE ENERGY AND UTILITIES BUSINESS, FROM AI. MORE POWER TO THE ENERGY AND UTILITIES BUSINESS, FROM AI www.infosys.com/aimaturity The current utility business model is under pressure from multiple fronts customers, prices, competitors, regulators,

More information

Gerald G. Boyd, Tom D. Anderson, David W. Geiser

Gerald G. Boyd, Tom D. Anderson, David W. Geiser THE ENVIRONMENTAL MANAGEMENT PROGRAM USES PERFORMANCE MEASURES FOR SCIENCE AND TECHNOLOGY TO: FOCUS INVESTMENTS ON ACHIEVING CLEANUP GOALS; IMPROVE THE MANAGEMENT OF SCIENCE AND TECHNOLOGY; AND, EVALUATE

More information

Empirical Research on Systems Thinking and Practice in the Engineering Enterprise

Empirical Research on Systems Thinking and Practice in the Engineering Enterprise Empirical Research on Systems Thinking and Practice in the Engineering Enterprise Donna H. Rhodes Caroline T. Lamb Deborah J. Nightingale Massachusetts Institute of Technology April 2008 Topics Research

More information

University of Massachusetts Amherst Libraries. Digital Preservation Policy, Version 1.3

University of Massachusetts Amherst Libraries. Digital Preservation Policy, Version 1.3 University of Massachusetts Amherst Libraries Digital Preservation Policy, Version 1.3 Purpose: The University of Massachusetts Amherst Libraries Digital Preservation Policy establishes a framework to

More information

FOREST PRODUCTS: THE SHIFT TO DIGITAL ACCELERATES

FOREST PRODUCTS: THE SHIFT TO DIGITAL ACCELERATES FOREST PRODUCTS: THE SHIFT TO DIGITAL ACCELERATES INTRODUCTION While the digital revolution has transformed many industries, its impact on forest products companies has been relatively limited, as the

More information

Editorial Preface ix EDITORIAL PREFACE. Andrew D. Bailey, Jr. Audrey A. Gramling Sridhar Ramamoorti

Editorial Preface ix EDITORIAL PREFACE. Andrew D. Bailey, Jr. Audrey A. Gramling Sridhar Ramamoorti Editorial Preface ix EDITORIAL PREFACE Andrew D. Bailey, Jr. Audrey A. Gramling Sridhar Ramamoorti The task of the university is the creation of the future, so far as rational thought, and civilized modes

More information

Report to Congress regarding the Terrorism Information Awareness Program

Report to Congress regarding the Terrorism Information Awareness Program Report to Congress regarding the Terrorism Information Awareness Program In response to Consolidated Appropriations Resolution, 2003, Pub. L. No. 108-7, Division M, 111(b) Executive Summary May 20, 2003

More information

Role of Knowledge Economics as a Driving Force in Global World

Role of Knowledge Economics as a Driving Force in Global World American International Journal of Research in Humanities, Arts and Social Sciences Available online at http://www.iasir.net ISSN (Print): 2328-3734, ISSN (Online): 2328-3696, ISSN (CD-ROM): 2328-3688 AIJRHASS

More information

Strategic & managerial issues behind technological diversification

Strategic & managerial issues behind technological diversification Strategic & managerial issues behind technological diversification Felicia Fai DIMETIC, April 2011 Fai, DIMETIC, April 2011 1 Introduction Earlier, considered notion of core competences, & applied concept

More information

Development and Integration of Artificial Intelligence Technologies for Innovation Acceleration

Development and Integration of Artificial Intelligence Technologies for Innovation Acceleration Development and Integration of Artificial Intelligence Technologies for Innovation Acceleration Research Supervisor: Minoru Etoh (Professor, Open and Transdisciplinary Research Initiatives, Osaka University)

More information

Confidently Assess Risk Using Public Records Data with Scalable Automated Linking Technology (SALT)

Confidently Assess Risk Using Public Records Data with Scalable Automated Linking Technology (SALT) WHITE PAPER Linking Liens and Civil Judgments Data Confidently Assess Risk Using Public Records Data with Scalable Automated Linking Technology (SALT) Table of Contents Executive Summary... 3 Collecting

More information

CHAPTER 1 PURPOSES OF POST-SECONDARY EDUCATION

CHAPTER 1 PURPOSES OF POST-SECONDARY EDUCATION CHAPTER 1 PURPOSES OF POST-SECONDARY EDUCATION 1.1 It is important to stress the great significance of the post-secondary education sector (and more particularly of higher education) for Hong Kong today,

More information