
 IT Infrastructure glossary & taxonomy
Evolving Terminologies for Emerging Technologies
Comments? Questions? Revisions?
Mary Chitty MSLS
mchitty@healthtech.com
Last revised July 08, 2019


Chemistry term index   Drug discovery term index   Informatics term index   Technologies term index    Biology term index 
Related glossaries include:
Drug discovery & development, Genomics, Proteomics
Informatics: Algorithms & data analysis, Bioinformatics, Clinical informatics, Drug discovery informatics, Information management & interpretation
Technologies: Combinatorial libraries & synthesis, Sequencing
Biology: DNA, Gene definitions, RNA, Protein Structures

agent: Definitions for autonomous agents, intelligent agents, user-agent.

Amazon Web Services AWS: Today's life science organizations must deal with increasingly complex network, storage and computational requirements. Next-generation lab instruments and protocols are changing faster than the underlying research IT infrastructures built to support them. Operating an efficient, scalable and agile research IT infrastructure in the face of such rapid change is a complex challenge we all encounter. In late 2007, [Chris] Dagdigian and his BioTeam colleagues realized that, without any managerial mandate, the whole group of consultants was independently experimenting with Amazon Web Services (AWS) to solve a customer problem. EC2 is ridiculously cheap, with almost infinite ways of controlling it. Bio-IT World Nov 18, 2009 http://www.bio-itworld.com/2009/11/18/c-word.html
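As a concrete illustration of that programmatic control, here is a minimal sketch using boto3, the AWS SDK for Python. The region, AMI ID, and instance type are placeholder assumptions, and credentials are assumed to be configured in the environment; this is an illustration, not a recipe from the article quoted above.

```python
# A minimal sketch of programmatic EC2 control with boto3. The AMI ID and
# instance type are hypothetical; credentials come from the environment.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # hypothetical AMI
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
)
print(response["Instances"][0]["InstanceId"])
```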

amorphous computing: refers to computational systems that use very large numbers of identical, parallel processors each having limited computational ability and local interactions. The term Amorphous Computing was coined at MIT in 1996 in a paper entitled "Amorphous Computing Manifesto" by Abelson, Knight, Sussman, et al. Wikipedia accessed 2018 Oct 25 https://en.wikipedia.org/wiki/Amorphous_computing

Amorphous Computing Homepage, Artificial Intelligence, MIT, US http://groups.csail.mit.edu/mac/projects/amorphous/
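A classic amorphous-computing primitive is the hop-count gradient: many identical particles, each aware only of neighbors within a small radius, cooperatively measure distance from a source. The Python sketch below simulates this; the particle count and radius are arbitrary illustrative choices, not values from the MIT work.

```python
# Hop-count gradient sketch: identical particles with only local,
# neighbor-to-neighbor interactions compute hop distance from a source.
import random

N, RADIUS = 500, 0.08
pts = [(random.random(), random.random()) for _ in range(N)]
neighbors = [
    [j for j, q in enumerate(pts)
     if j != i and (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 < RADIUS ** 2]
    for i, p in enumerate(pts)
]

hops = [None] * N
hops[0] = 0            # particle 0 broadcasts as the source
frontier = [0]
while frontier:        # each round uses only local messages
    nxt = []
    for i in frontier:
        for j in neighbors[i]:
            if hops[j] is None:
                hops[j] = hops[i] + 1
                nxt.append(j)
    frontier = nxt

reached = [h for h in hops if h is not None]
print(f"{len(reached)} of {N} particles reached; max {max(reached)} hops")
```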

autonomic computing: Self-managing characteristics of distributed computing resources, adapting to unpredictable changes while hiding intrinsic complexity from operators and users. https://en.wikipedia.org/wiki/Autonomic_computing

Beowulf computing: Wikipedia http://en.wikipedia.org/wiki/Beowulf_(computing)   

bikeshed:  Why should I care? [metaphor] http://www.unixguide.net/freebsd/faq/16.19.shtml  Thanks to World Wide Words 

biomedical computational science and technology: The NIH is interested in promoting research and development in computational science and technology that will support rapid progress in areas of scientific opportunity in biomedical research. As defined here, biomedical computing or biomedical information science and technology includes database design, graphical interfaces, querying approaches, data retrieval, data visualization and manipulation, data integration through the development of integrated analytical tools, and tools for electronic collaboration, as well as computational and mathematical research including the development of structural, functional, integrative, and analytical models and simulations. Program Announcement (PA) Number: PAR-09-218 https://grants.nih.gov/grants/guide/pa-files/PAR-09-218.html

Blockchain in Pharma, R&D, and Healthcare: Empowering the Networking Ecosystem 2019 April 17-18 Boston MA. Blockchain is becoming increasingly adopted to address the perennial networking challenges of visibility, integrity, security, and speed for data management in pharma, R&D, and healthcare. Innovative personalized therapies are driving this shift toward proprietary data-driven life sciences supply chains, from basic research to R&D to clinical trials to manufacturing to patient. http://www.bio-itworldexpo.com/blockchain

Cloud Computing: Applying Cloud for Expanding Applications 2019 April 17-18 Boston MA. Cloud computing has become the platform enterprises turn to for their application analysis as well as data storage. Data-intensive life scientists, from biological researchers to biopharmaceutical organizations, recognize this practicality and necessity. Thus, adoption has been greater than anyone expected and users continue to expand applications. Through case studies, this track explores the rapid growth and progressive maturation of the cloud, as well as evolving provider and user experiences. http://www.bio-itworldexpo.com/cloud-computing


See also: Amazon Web Services AWS,  utility computing

computational linguistics: Information management & interpretation

computational video: The study and application of the processing of streamed video data. This field of research is emerging from the convergence of two technologies: digital cameras, and high performance computing with high-bandwidth networks. In addition, past and current research in machine vision has provided practical solutions to some of the fundamental problems inherent in processing video. Institute for Information Technology, National Research Council, Canada, Research Programs Computational Video http://iit-iti.nrc-cnrc.gc.ca/templates/itiiit/itiiit2.cfm?CFID=33974&CFTOKEN=93356...

compute farm: Related terms: compute server farm, ranch, server farm.  

computer virus: Virus Glossary of Terms, McAfee, 100+ definitions, http://home.mcafee.com/VirusInfo/Glossary.aspx

computers: Narrower terms include high performance computers, supercomputers

computing: Related terms include ASP Active Server Pages, compute farm, informatics, MPP Massively Parallel Processing, parallel processing, petaflop, teraflop, server farm, supercomputer. Narrower terms: DCE Distributed Computing Environment, DNA computing, grid computing, high performance computing, molecular computing, quantum computing, soft computing, utility computing

configurable: To eliminate confusion and hopefully give the industry a common way to talk about these issues, LNS Research will use the following definitions for each of these terms:
Out-of-the-Box: Any functionality that comes shipped directly from the software vendor or can be configured easily (where "easily" means configured by a business user, not IT) with built-in workflow tools, templates, and/or best practices provided directly by the vendor.
Configurable: Any functionality that can be created using built-in workflow tools shipped by the vendor. To be considered configurable, functionality should be forward-compatible with future releases.
Customizable: Any functionality that is configured using built-in workflow tools shipped by the vendor but may not be forward-compatible with future releases; also, any functionality not shipped directly from the vendor that cannot be created using the vendor's built-in workflow tools. Customization carries no guarantee of compatibility with future releases and risks being costly to maintain over time.
Understanding Out-of-the-Box vs. Configured vs. Customized Software, posted by Matthew Littlefield, Fri, Jan 30, 201 http://blog.lnsresearch.com/blog/bid/204226/Understanding-Out-of-the-Box-vs-Configured-vs-Customized-Software

Configurable gives users the chance to modify options without expensive programming.
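To make the distinction concrete, here is a minimal sketch with hypothetical step names and threshold: "configurable" behavior lives in vendor-supported data that a business user can edit, while customization would mean new code outside such tools, with no forward-compatibility guarantee.

```python
# Configurable vs. customized, as a sketch: the workflow is data a business
# user could edit; the engine code stays unchanged across releases.
workflow_config = {
    "steps": ["receive_sample", "qc_check", "archive"],  # editable, not code
    "qc_threshold": 0.95,
}

def run_workflow(config, qc_score):
    """Generic engine: behavior changes with the config, not the code."""
    for step in config["steps"]:
        if step == "qc_check" and qc_score < config["qc_threshold"]:
            return "rejected at QC"
    return "completed"

print(run_workflow(workflow_config, qc_score=0.97))  # -> completed
```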

CORBA Common Object Request Broker Architecture: A set of core specifications proposed by the Object Management Group (OMG). CORBA is designed to be object-oriented.

Common Object Request Broker Architecture, OMG's open, vendor-independent architecture and infrastructure that computer applications use to work together over networks. Using the standard protocol IIOP, a CORBA-based program from any vendor, on almost any computer, operating system, programming language, and network, can interoperate with a CORBA-based program from the same or another vendor, on almost any other computer, operating system, programming language, and network. CORBA FAQ, OMG, 1997-2002 http://www.omg.org/gettingstarted/corbafaq.htm  Related terms: interoperability, object-oriented

customizable: Customized implies programming and expense.   Compare configurable

Data & Storage Management: Infrastructure and data storage solutions to enable discovery and ease data bloat 2018 April 16-18 Boston MA. Is the burden of managing your data growing larger every day? Do you have a scalable and robust data management infrastructure in place to process, analyze, and store vast quantities of data according to your organization's policies? Is your organization using new tools and analytical processes, such as AI and deep learning, that stress your supporting IT infrastructure beyond the expectations of system designers? Managing data has become a prevalent issue in the life sciences industry. Organizations are spending millions on systems and platforms to manage and store many types of data (e.g., experimental, operational, clinical) from many disparate sources. The role of data engineering is critical in orchestrating, configuring, managing, and scaling solutions to manage the data bloat problem. http://www.bio-itworldexpo.com/data-storage

Data Computing: Improving Scale, Speed and Ease of Deployment 2019 April 17-18 Boston MA. There is increasing demand for computing power from life science researchers and scientists tackling big data issues. To do their work, their storage and infrastructure must be able to scale to handle billions of data points and files efficiently. The Data Computing track will explore the data computing resources and application deployment tools needed to process computational workflows and drive automation, advance analytics capabilities, reproduce software deployment, maximize application performance, and drive broad organizational decision processes. http://www.bio-itworldexpo.com/data-computing

Data Management in the Cloud 2019 March 11-13 San Francisco CA. Integrating Data from Disparate Sources to Achieve the Goal of Personalized Medicine. Today, IT professionals are challenged with finding solutions to manage the big data being generated at research labs, pharmaceutical companies and medical centers. In order to do this, one must have the compute power, storage solutions, and analytic capability to make the data clinically actionable. Data from disparate sources including omics (genomics, proteomics, metabolomics, etc.), imaging, and sensors must be integrated. Cambridge Healthtech Institute's 2nd Annual Data Management in the Cloud program will bring together key leaders in the fields of cloud architecture and data management to share case studies and to discuss the challenges and solutions they face in their centers. Overall, this event will offer practical solutions for network engineers, data architects, software engineers, etc. to build data ecosystems that enable the goal of personalized medicine. https://www.bio-itworldexpowest.com/Converged-IT-and-the-Cloud

Edge: Intelligent Compute from Device to Cloud 2019 April 17-18 Boston MA. Simply defined, the edge is where people and devices or things connect with the network. Compute-intensive edge applications, including the Internet of Things (IoT), augmented reality (AR), and artificial intelligence (AI), interact with their environment based on ever-changing conditions. Where should this complex data be transferred, stored, and analyzed? During the inaugural Edge track, data scientists share their real-world living-on-the-edge experiences of leveraging this shifting space to deliver on the promise of the cloud and its growing complexity. http://www.bio-itworldexpo.com/edge

edge computing: Also known as just "edge," edge computing brings processing close to the data source, so that data does not need to be sent to a remote cloud or other centralized systems for processing. By eliminating the distance and time it takes to send data to centralized sources, we can improve the speed and performance of data transport, as well as of devices and applications on the edge. Cisco, Edge Computing vs. Fog Computing: Definitions and Enterprise Applications https://www.cisco.com/c/en/us/solutions/enterprise-networks/edge-computing.html

A distributed computing paradigm in which computation is largely or completely performed on distributed device nodes known as smart devices or edge devices, as opposed to primarily taking place in a centralized cloud environment. The eponymous "edge" refers to the geographic distribution of computing nodes in the network as Internet of Things devices, which are at the "edge" of an enterprise, metropolitan or other network. The motivation is to provide server resources, data analysis and artificial intelligence ("ambient intelligence") closer to data collection sources and cyber-physical systems such as smart sensors and actuators.[1] Edge computing is seen as important in the realization of physical computing, smart cities, ubiquitous computing and the Internet of Things. Wikipedia accessed 2018 Dec. 10 https://en.wikipedia.org/wiki/Edge_computing

Related term: Internet of Things
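A minimal sketch of that idea in Python: raw samples stay at the device and only a compact aggregate would cross the network. The sensor stream and summary format are simulated stand-ins, not part of any vendor API.

```python
# Edge-processing sketch: aggregate readings locally, ship only a summary.
import random
import statistics

def read_sensor(n=600):
    """Simulate one minute of 10 Hz temperature readings at the device."""
    return [20.0 + random.gauss(0, 0.5) for _ in range(n)]

def summarize_at_edge(readings):
    """Reduce raw samples locally instead of sending them all to the cloud."""
    return {
        "count": len(readings),
        "mean": round(statistics.mean(readings), 3),
        "max": round(max(readings), 3),
    }

summary = summarize_at_edge(read_sensor())
print(summary)  # only this small record would cross the network
```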

FLOP: Floating point operations per second. A measure of how fast a computer is, based on calculations per second. A floating point is a number representation consisting of a mantissa, an exponent, and an assumed radix. The number represented is M multiplied by R raised to the power of E (M*R^E), where R is the radix or base of the number system. (For example, 10 is the radix of the decimal system.) National Center for Supercomputing Applications, MetaComputer Glossary, Univ. of Illinois, Urbana-Champaign 1995 http://archive.ncsa.uiuc.edu/Cyberia/MetaComp/MetaGlossary.html   Related terms: petaflop, teraflop
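A quick way to see the M*R^E form is Python's standard library, which exposes the mantissa and exponent of a float directly; a minimal sketch, assuming only CPython's IEEE-754 doubles (radix 2):

```python
# The M*R^E representation, inspected with the standard library.
import math

x = 6.25
m, e = math.frexp(x)        # x == m * 2**e, with 0.5 <= |m| < 1
print(m, e)                 # 0.78125 3
print(math.ldexp(m, e))     # reconstructs 6.25

# The same idea in radix 10: 6.25 == 0.625 * 10**1
```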

Fog computing is a standard that defines how edge computing should work, and it facilitates the operation of compute, storage and networking services between end devices and cloud computing data centers. Additionally, many use fog as a jumping-off point for edge computing.  Cisco, Edge Computing vs. Fog Computing: Definitions and Enterprise Applications https://www.cisco.com/c/en/us/solutions/enterprise-networks/edge-computing.html  Related term: Internet of Things

geek: http://en.wikipedia.org/wiki/Geek  Compare with nerd; there are many nuances and variations.

Google Custom Search APIs and Tools Glossary  http://code.google.com/apis/customsearch/docs/glossary.html 

grid computing: Wikipedia http://en.wikipedia.org/wiki/Grid_computing  
Narrower term: desktop grids. Related terms: utility grids; semantic grid (Information management & interpretation)

Grid computing glossary, Israel Association of Grid Technologies  http://www.grid.org.il/?CategoryID=365 

Hadoop: The Apache™ Hadoop® project develops open-source software for reliable, scalable, distributed computing.  The Apache Hadoop software library is a framework that allows for the distributed processing of large data sets across clusters of computers using simple programming models. It is designed to scale up from single servers to thousands of machines, each offering local computation and storage. Rather than rely on hardware to deliver high-availability, the library itself is designed to detect and handle failures at the application layer, so delivering a highly-available service on top of a cluster of computers, each of which may be prone to failures. Hadoop http://hadoop.apache.org/
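Hadoop's programming model is easiest to see through its Streaming interface, which lets any executable that reads stdin and writes stdout act as a mapper or reducer. Below is a minimal word-count sketch in Python; the script name and the job invocation in the closing comment are illustrative assumptions, not taken from the Apache documentation quoted above.

```python
#!/usr/bin/env python3
# wordcount.py -- run as "python3 wordcount.py map" or "... reduce".
import sys

def mapper():
    # Emit "<word>\t1" for every word read from stdin.
    for line in sys.stdin:
        for word in line.split():
            print(f"{word}\t1")

def reducer():
    # Hadoop sorts map output by key, so all counts for a word arrive
    # together; sum them and emit "<word>\t<total>".
    current, total = None, 0
    for line in sys.stdin:
        word, count = line.rstrip("\n").split("\t")
        if word != current:
            if current is not None:
                print(f"{current}\t{total}")
            current, total = word, 0
        total += int(count)
    if current is not None:
        print(f"{current}\t{total}")

if __name__ == "__main__":
    mapper() if sys.argv[1] == "map" else reducer()

# Cluster invocation (jar path and HDFS paths are placeholders):
# hadoop jar hadoop-streaming.jar -input in/ -output out/ \
#   -mapper "python3 wordcount.py map" -reducer "python3 wordcount.py reduce"
```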

high performance computing: Webopedia definition http://www.webopedia.com/TERM/H/High_Performance_Computing.html    Related terms: Distributed Computing Environment DCE, MPP Massively Parallel Processing, petaflop, supercomputers, teraflop

Internet of Things (IoT):  the network of physical devices, vehicles, home appliances, and other items embedded with electronics, software, sensors, actuators, and connectivity which enables these things to connect, collect and exchange data.[1][2][3][4] IoT involves extending Internet connectivity beyond standard devices, such as desktops, laptops, smartphones and tablets, to any range of traditionally dumb or non-internet-enabled physical devices and everyday objects. Embedded with technology, these devices can communicate and interact over the Internet, and they can be remotely monitored and controlled. Wikipedia accessed 2018 Nov 1 https://en.wikipedia.org/wiki/Internet_of_things  
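As a concrete sketch of a "thing" exchanging data, here is a device publishing a reading over MQTT, a protocol widely used for IoT. It assumes the paho-mqtt Python package (1.x-style constructor; v2 adds a CallbackAPIVersion argument); the broker host, topic, and payload fields are invented for illustration.

```python
# IoT sketch: a device publishes a sensor reading to an MQTT broker.
import json
import paho.mqtt.client as mqtt

client = mqtt.Client()
client.connect("broker.example.com", 1883)    # hypothetical broker
payload = json.dumps({"device": "thermo-01", "temp_c": 21.4})
client.publish("lab/sensors/thermo-01", payload)
client.disconnect()
```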

Kevin Lonergan at Information Age, a business-technology magazine, has referred to the terms surrounding IoT as a "terminology zoo".[207] The lack of clear terminology is not "useful from a practical point of view" and a "source of confusion for the end user".[207] A company operating in the IoT space could be working in anything related to sensor technology, networking, embedded systems, or analytics.[207] According to Lonergan, the term IoT was coined before smart phones, tablets, and devices as we know them today existed, and there is a long list of terms with varying degrees of overlap and technological convergence: Internet of things, Internet of everything (IoE), Internet of Goods (Supply Chain), industrial Internet, pervasive computing, pervasive sensing, ubiquitous computing, cyber-physical systems (CPS), wireless sensor networks (WSN), smart objects, digital twin, cyberobjects or avatars,[108] cooperating objects, machine to machine (M2M), ambient intelligence (AmI), operational technology (OT), and information technology (IT).[207] Regarding IIoT, an industrial sub-field of IoT, the Industrial Internet Consortium's Vocabulary Task Group has created a "common and reusable vocabulary of terms"[208] to ensure "consistent terminology"[208][209] across publications issued by the Industrial Internet Consortium. IoT One has created an IoT Terms Database including a New Term Alert[210] to be notified when a new term is published. As of March 2017, this database aggregates 711 IoT-related terms, while keeping material "transparent and comprehensive."[211][212] https://en.wikipedia.org/wiki/Internet_of_things#Confusing_terminology

legacy systems: Wikipedia http://en.wikipedia.org/wiki/Legacy_systems  

Linux: A family of free and open-source software operating systems built around the Linux kernel. Typically, Linux is packaged in a form known as a Linux distribution (or distro for short) for both desktop and server use. The defining component of a Linux distribution is the Linux kernel,[11] an operating system kernel first released on September 17, 1991, by Linus Torvalds.[12][13][14] Many Linux distributions use the word "Linux" in their name. Wikipedia accessed 2018 Nov 1 http://en.wikipedia.org/wiki/Linux

machine-understandable:  http://www.w3.org/DesignIssues/Semantic.html  
See also under metadata  

markup languages:  XML  eXtensible Markup Language; Bioengineering & Biomaterials BIOML Biopolymer Markup Language, MatML Materials Markup Language; Bioinformatics BSML Bioinformatic Sequence Markup Language; Cheminformatics CML Chemical Markup Language; Information management DAML DARPA Agent Markup Language, DAML + OIL; Drug discovery informatics  VRML Virtual Reality Modeling Language; Microarrays GEML Gene Expression Markup Language, MAGE-ML MicroArray and Gene Expression Markup Language
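All of the languages listed above share XML's self-describing tag structure. A minimal Python sketch of reading such a document, using hypothetical element and attribute names rather than actual BIOML or BSML tags, and only the standard library:

```python
# Markup sketch: a tiny XML document parsed with the standard library.
# Element/attribute names below are invented for illustration.
import xml.etree.ElementTree as ET

doc = """<sequence id="seq1" organism="E. coli">
           <residues>ATGGCGTAA</residues>
         </sequence>"""
root = ET.fromstring(doc)
print(root.get("organism"), root.findtext("residues"))  # E. coli ATGGCGTAA
```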

markup languages, standards core: Robin Cover, Core Standards for Markup Languages, 2002  http://xml.coverpages.org/coreStandards.html

metacomputer: A collection of computers held together by state-of-the-art technology and "balanced" so that, to the individual user, it looks and acts like a single computer. The constituent parts of the resulting "metacomputer" could be housed locally, or distributed between buildings, even continents. [MetaComputer HomePage, National Center for Supercomputing Applications, Univ. of Illinois Urbana-Champaign, US] http://archive.ncsa.uiuc.edu/Cyberia/MetaComp/MetaHome.html
MetaComputer Glossary of Terms
http://archive.ncsa.uiuc.edu/Cyberia/MetaComp/MetaGlossary.html

metadata: Information management & interpretation

middleware:  Wikipedia  http://en.wikipedia.org/wiki/Middleware   Related term: DCE Distributed Computing Environment  

modularity: Ensures that, for the particular task at hand, the data will be collected and stored in an appropriate manner, which differs greatly from one level of activity (simply gathering the raw data) to another (storing analyzed data) and from one type of high-throughput system to another. ... The best system is one that employs integration at those levels where it is an advantage but maintains enough modularity to ensure that (1) there are no major compromises regarding how any one type of data is handled and (2) all the key elements in a researcher's information system can be adjusted or updated independently. Related terms: integration, interoperability
Wikipedia http://en.wikipedia.org/wiki/Modularity    

molecular computers: Computers whose input, output and state transitions are carried out by biochemical interactions and reactions. MeSH 2003   

molecular computing:   Related terms: DNA computing, quantum computing. Or are any of these the same?  

Moore's Law: Intel co-founder Gordon Moore is a visionary. His prediction, popularly known as Moore's Law, states that the number of transistors on a chip will double about every two years. http://www.intel.com/technology/mooreslaw/   Wikipedia http://en.wikipedia.org/wiki/Moore's_law 
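Moore's Law reduces to simple arithmetic: a count that doubles every two years grows by a factor of 2^(years/2). A minimal sketch, using the well-known 1971 Intel 4004 baseline of roughly 2,300 transistors; later values are projections of the formula, not actual chip data:

```python
# Moore's Law as arithmetic: doubling every two years from a 1971 baseline.
def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1971, 1981, 1991, 2001, 2011):
    print(year, f"{transistors(year):,.0f}")
# 2011 -> ~2.4 billion (20 doublings over 40 years)
```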

open source: Open source definition annotated http://www.opensource.org/docs/definition.php   
open source software: Wikipedia http://en.wikipedia.org/wiki/Open_source_software
The Cathedral and the Bazaar, Eric Steven Raymond http://catb.org/~esr/writings/cathedral-bazaar/

open source programming languages: http://opensourceforu.com/2017/03/most-popular-programming-languages/

peta: 10^15 (quadrillions). SI unit prefixes beyond peta are exa 10^18 (quintillions), zetta 10^21 (sextillions) and yotta 10^24 (septillions). Compare with prefixes for the smallest numbers: Ultrasensitivity atto, femto, micro, nano, pico, yocto, zepto

petaflop: A petaflops computer is more powerful than all of the computers on today's Internet combined. If such a system incorporated a petabyte of memory, it could hold all 17 million books in the Library of Congress or several thousand years' worth of videotapes. To fabricate such a system today from the best price/performance systems available requires up to 10 million processors and consumes more than one billion watts of power. Its cost would be approximately $25 billion, and the supercomputer would fail every couple of minutes. The system would cover the flight decks of all existing Nimitz-class aircraft carriers or fill up most of the Empire State Building with its hardware. T. Sterling "In pursuit of a quadrillion operations per second" Insights, NASA, Apr. 1998  http://www.hq.nasa.gov/hpcc/insights/vol5/petaflop.htm    Related term: teraflop computing. Broader term: FLOP

quantum computing: Computing using quantum-mechanical phenomena, such as superposition and entanglement.[1] A quantum computer is a device that performs quantum computing. Quantum computers are different from binary digital electronic computers based on transistors. Whereas common digital computing requires that the data be encoded into binary digits (bits), each of which is always in one of two definite states (0 or 1), quantum computation uses quantum bits, which can be in superpositions of states. A quantum Turing machine is a theoretical model of such a computer, and is also known as the universal quantum computer. The field of quantum computing was initiated by the work of Paul Benioff[2] and Yuri Manin in 1980,[3] Richard Feynman in 1982,[4] and David Deutsch in 1985.[5]  ... As of 2018, the development of actual quantum computers is still in its infancy, but experiments have been carried out in which quantum computational operations were executed on a very small number of quantum bits.[7] Both practical and theoretical research continues, and many national governments and military agencies are funding quantum computing research in an additional effort to develop quantum computers for civilian, business, trade, environmental and national security purposes, such as cryptanalysis.[8] A small 20-qubit quantum computer exists and is available for experiments via the IBM quantum experience project. Wikipedia accessed 2018 Feb 16 https://en.wikipedia.org/wiki/Quantum_computing
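Superposition, the key idea above, can be seen in a few lines by simulating a single qubit classically: apply a Hadamard gate to |0> and sample measurement outcomes. A minimal sketch in pure NumPy; no quantum SDK or hardware is assumed.

```python
# Single-qubit superposition, simulated classically with NumPy.
import numpy as np

ket0 = np.array([1.0, 0.0])                   # |0>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
state = H @ ket0                              # (|0> + |1>) / sqrt(2)

probs = np.abs(state) ** 2                    # Born rule: ~[0.5, 0.5]
counts = np.bincount(np.random.choice(2, size=1000, p=probs))
print(probs, counts)                          # roughly even 0/1 outcomes
```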

One class of problems in which quantum computers have a significant speed advantage is the modeling of large molecules to understand specific interactions and chemical processes. The idea is to use quantum processors to create a quantum (as opposed to a digital) twin, or simulation, and model the quantum processes involved at the subatomic level. Pharmaceutical and chemical companies are already experimenting with the potential of quantum simulation to accelerate drug discovery and design molecules with fewer unintended side effects. Executives in these industries estimate that identifying new targets in this way could increase the rate of drug discovery by 5% to 10% and accelerate development times by 15% to 20%. Boston Consulting Group, Coming Quantum Leap Computing, 2018 https://www.bcg.com/en-us/publications/2018/coming-quantum-leap-computing.aspx

Related terms: DNA computing, molecular computing, nanocomputer. Or are any of these the same?   

quantum information: http://en.wikipedia.org/wiki/Quantum_information

supercomputer: FOLDOC definition http://foldoc.org/supercomputer
Webopedia definition http://www.webopedia.com/TERM/S/supercomputer.html

Very fast computers. Often used for graphics, modeling or simulations. Related terms: high performance computing, petaflop, teraflop; Protein structure Blue gene 

teraFlop (Tflop): 10^12 floating point operations per second (trillions). The development of massively parallel computers with teraflop speed and the mastering of the associated programming problems will clearly shape new computational solutions for biomedicine in coming years ... in the field of experimental structural biology. Techniques for the experimental determination of biological structure increasingly rely on advanced computational tools. X-ray crystallography, NMR structure determination, and single molecule electron microscopy all continue to make advances in capabilities following increases in computing power. Opportunities in Molecular Biomedicine in the Era of Teraflop Computing, March 3 & 4, 1999, Rockville, MD, NIH Resource for Macromolecular Modeling and Bioinformatics; Beckman Institute for Advanced Science and Technology, University of Illinois at Urbana-Champaign. Molecular Biomedicine in the Era of Teraflop Computing - DDDAS.org  Related term: petaflop computing. Broader term: FLOP

utility computing: [Chris] Dagdigian [of BioTeam] tries hard not to use the term 'cloud,' preferring instead utility computing or simply "the C word." "Amazon Web Services is the cloud," said Dagdigian. Bio-IT World Nov 18, 2009 http://www.bio-itworld.com/2009/11/18/c-word.html

Computing power on demand (similar to electricity). 

IT Infrastructure resources
FOLDOC Free On-line Dictionary of Computing, Denis Howe,   http://foldoc.org/ 
Gartner IT Glossary https://www.gartner.com/it-glossary/

IBM Terminology Website http://www-01.ibm.com/software/globalization/terminology/index.html  
Intel Glossary https://www.intel.com/content/www/us/en/support/topics/glossary.html

Jargon File 4.4.7, 2004  http://catb.org/esr/jargon/ 
McAfee Online Security Glossary 2003-2017 http://home.mcafee.com/VirusInfo/Glossary.aspx  
National Center for Supercomputing Applications, MetaComputer Glossary, Univ. of Illinois, Urbana-Champaign 1995. 45 definitions. http://archive.ncsa.uiuc.edu/Cyberia/MetaComp/MetaGlossary.html
W3C Glossary & Dictionary, http://www.w3.org/2003/glossary/  2003-2010 
Webopedia, Quinstreet   http://www.webopedia.com/
whatis.com Information Technology encyclopedia.  http://whatis.techtarget.com/

How to look for other unfamiliar terms

 
