Gartner, Inc.
Industry: Consulting
Number of terms: 1807
Number of blossaries: 2
Company Profile:
Gartner delivers technology research that helps global technology business leaders make informed decisions on key initiatives.
Data ops is the hub for collecting and distributing data, with a mandate to provide controlled access to systems of record for customer and marketing performance data while protecting privacy, enforcing usage restrictions and preserving data integrity.
Industry: Technology
The process of discovering meaningful correlations, patterns and trends by sifting through large amounts of data stored in repositories. Data mining employs pattern recognition technologies, as well as statistical and mathematical techniques.
Industry: Technology
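As a toy sketch of the correlation-discovery side of data mining described above, the snippet below scans a tabular dataset for strongly correlated column pairs; the dataset, column names and threshold are invented for illustration, and pandas is assumed to be available.

```python
# Toy data mining sketch: find strong pairwise Pearson correlations in a
# table. All column names, values and the 0.8 threshold are hypothetical.
import pandas as pd

def strong_correlations(df: pd.DataFrame, threshold: float = 0.8):
    """Return column pairs whose absolute correlation exceeds the threshold."""
    corr = df.corr()
    pairs = []
    for i, a in enumerate(corr.columns):
        for b in corr.columns[i + 1:]:
            r = float(corr.loc[a, b])
            if abs(r) >= threshold:
                pairs.append((a, b, round(r, 3)))
    return pairs

sales = pd.DataFrame({
    "ad_spend": [10, 20, 30, 40, 50],
    "visits":   [110, 205, 330, 390, 520],
    "returns":  [5, 3, 6, 2, 4],
})
print(strong_correlations(sales))  # e.g. [('ad_spend', 'visits', 0.99...)]
```

Real data mining tools add pattern recognition and statistical modeling on top of this kind of exploratory scan, but the pairwise-correlation pass is a common first step.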
Gartner defines data management and integration as the practices, architectural techniques and tools for achieving consistent access to and delivery of data across the spectrum of data subject areas and data structure types in the enterprise, to meet the data consumption requirements of all applications and business processes.
Industry: Technology
Data loss protection (DLP) describes a set of technologies and inspection techniques used to classify information content contained within an object — such as a file, email, packet, application or data store — while at rest (in storage), in use (during an operation) or in transit (across a network). DLP tools also have the ability to dynamically apply a policy — such as log, report, classify, relocate, tag and encrypt — and/or apply enterprise data rights management protections.
Industry: Technology
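A minimal sketch of the classify-then-apply-policy flow that the DLP definition describes: content is matched against patterns, and each classification maps to a configured action. The patterns, policy table and action names below are illustrative inventions, not any product's actual rules.

```python
# DLP-style content inspection sketch: classify an object's content by
# pattern matching, then look up the policy action for each classification.
import re

PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "us_ssn":      re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}
POLICY = {"credit_card": "encrypt", "us_ssn": "log"}  # hypothetical policy

def classify(text: str) -> list[str]:
    return [label for label, rx in PATTERNS.items() if rx.search(text)]

def enforce(text: str) -> list[tuple[str, str]]:
    """Return (classification, action) pairs for the inspected content."""
    return [(label, POLICY.get(label, "report")) for label in classify(text)]

print(enforce("Card 4111 1111 1111 1111 on file, SSN 123-45-6789"))
```

A production DLP tool applies the same idea to files, email, packets and data stores, at rest, in use and in transit, rather than to a single string.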
A file format developed for VisiCalc, the first electronic spreadsheet. It is still used today as a means of transferring files to and from spreadsheets.
Industry: Technology
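The format described above reads as DIF (Data Interchange Format). Below is a sketch of emitting its commonly documented layout (TABLE/VECTORS/TUPLES/DATA header chunks, BOT-delimited tuples, an EOD terminator); treat the exact layout as an assumption and verify the output against a real spreadsheet before relying on it.

```python
# Assumed DIF writer: header chunks are (topic, "0,value", quoted string);
# each row starts with "-1,0\nBOT"; numbers are "0,<n>\nV", strings '1,0\n"s"'.
def write_dif(path, rows):
    n_cols = len(rows[0])
    with open(path, "w") as f:
        for topic, value in (("TABLE", 1), ("VECTORS", n_cols),
                             ("TUPLES", len(rows)), ("DATA", 0)):
            f.write(f'{topic}\n0,{value}\n""\n')
        for row in rows:
            f.write("-1,0\nBOT\n")
            for cell in row:
                if isinstance(cell, (int, float)):
                    f.write(f"0,{cell}\nV\n")    # numeric value
                else:
                    f.write(f'1,0\n"{cell}"\n')  # string value
        f.write("-1,0\nEOD\n")

write_dif("report.dif", [["region", "sales"], ["East", 120], ["West", 95]])
```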
The discipline of data integration comprises the practices, architectural techniques and tools for achieving the consistent access and delivery of data across the spectrum of data subject areas and data structure types in the enterprise to meet the data consumption requirements of all applications and business processes.

Data integration tools have traditionally been delivered via a set of related markets, with vendors in each market offering a specific style of data integration tool. In recent years, most of the activity has been within the ETL tool market. Markets for replication tools, data federation (EII) and other submarkets each included vendors offering tools optimized for a particular style of data integration, and periphery markets (such as data quality tools, adapters and data modeling tools) also overlapped with the data integration tool space.

The result of all this historical fragmentation in the markets is the equally fragmented and complex way in which data integration is accomplished in large enterprises — different teams using different tools, with little consistency, lots of overlap and redundancy, and no common management or leverage of metadata. Technology buyers have been forced to acquire a portfolio of tools from multiple vendors to amass the capabilities necessary to address the full range of their data integration requirements.

This situation is now changing, with the separate and distinct data integration tool submarkets converging at the vendor and technology levels. This is being driven by buyer demands as organizations realize they need to think about data integration holistically and have a common set of data integration capabilities they can use across the enterprise. It is also being driven by the actions of vendors, such as those in individual data integration submarkets organically expanding their capabilities into neighboring areas, as well as by acquisition activity that brings vendors from multiple submarkets together. The result is a market for complete data integration tools that address a range of different data integration styles and are based on common design tooling, metadata and runtime architecture.
Industry: Technology
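The ETL style mentioned in the entry above is easiest to see in miniature: extract rows from a source, transform and cleanse them, and load them into a target. The file name, column names and quality rule below are hypothetical stand-ins.

```python
# Minimal ETL sketch: CSV source -> normalization/quality rule -> SQLite target.
import csv
import sqlite3

def etl(csv_path: str, db_path: str) -> None:
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS customers (id INTEGER, email TEXT)")
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):              # extract
            email = row["email"].strip().lower()   # transform: normalize
            if "@" not in email:                   # transform: quality rule
                continue                           # drop records that fail it
            con.execute("INSERT INTO customers VALUES (?, ?)",
                        (int(row["id"]), email))   # load
    con.commit()
    con.close()
```

Replication, federation and the other styles the entry lists differ in when and where data moves, but they share this same extract/deliver core, which is why the submarkets have been able to converge.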
Data dependency mapping products are software products that determine and report on the likelihood of achieving specified recovery targets, based on analyzing and correlating data from applications, databases, clusters, OSs, virtual systems, networking and storage replication mechanisms. These products operate on direct-attached storage (DAS), storage-area-network (SAN)-connected storage and network-attached storage (NAS) at the primary production and secondary recovery data centers.
Industry: Technology
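A toy model of the correlation such dependency mapping products perform: given the replication lag of each component an application depends on, report whether a recovery point objective (RPO) is achievable. The components, lag figures and RPO are invented examples.

```python
# Sketch: an application's achievable recovery point is bounded by the worst
# replication lag among everything it depends on. All values are hypothetical.
RPO_SECONDS = 300  # recovery target: lose at most 5 minutes of data

replication_lag = {      # seconds of lag per replicated component
    "orders_db": 45,
    "orders_app_config": 600,  # config only replicated every 10 minutes
    "san_lun_12": 30,
}
app_dependencies = {"orders_app": ["orders_db", "orders_app_config", "san_lun_12"]}

for app, parts in app_dependencies.items():
    worst = max(replication_lag[p] for p in parts)
    status = "achievable" if worst <= RPO_SECONDS else "at risk"
    print(f"{app}: RPO {RPO_SECONDS}s is {status} (worst lag {worst}s)")
```

Real products discover these dependencies automatically across DAS, SAN and NAS rather than taking them from a hand-written table, but the correlation logic is of this shape.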
A language used to describe the data model for a database, i.e., the names and access paths for the data and how they are interrelated. In some software products, the DDL describes the logical, not the physical, data. Other products use it to describe both.
Industry: Technology
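A small illustration of the DDL idea: the statements below describe the logical model only (names, types, relationships and an access path), leaving physical storage to the database. Table and column names are invented, and SQLite is used purely as a convenient executable host.

```python
# DDL sketch: describe names, relationships and access paths, not storage.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE customer (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL
    );
    CREATE TABLE orders (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES customer(customer_id), -- relationship
        total       REAL
    );
    CREATE INDEX idx_orders_customer ON orders(customer_id);  -- access path
""")
```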
Data deduplication is a form of compression that eliminates redundant data on a subfile level, improving storage utilization. In this process, only one copy of the data is stored; each redundant copy is eliminated and replaced with a pointer to the stored copy. Deduplication can significantly reduce the required disk space, since only unique data is stored.
Industry: Technology
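A minimal sketch of subfile deduplication as defined above: split data into chunks, store each unique chunk once keyed by its hash, and keep per-object pointer lists. Real products typically use variable-size (content-defined) chunking rather than the fixed-size chunks assumed here.

```python
# Dedup sketch: fixed-size chunking, SHA-256 content addressing, and
# pointer lists in place of redundant copies.
import hashlib

CHUNK = 4096
store: dict[str, bytes] = {}  # hash -> single stored copy of the chunk

def dedup_write(data: bytes) -> list[str]:
    """Return the list of chunk hashes ("pointers") representing the data."""
    pointers = []
    for i in range(0, len(data), CHUNK):
        chunk = data[i:i + CHUNK]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # keep only the first copy
        pointers.append(digest)
    return pointers

def dedup_read(pointers: list[str]) -> bytes:
    return b"".join(store[p] for p in pointers)

ptrs = dedup_write(b"A" * 8192 + b"B" * 4096)  # two identical "A" chunks
print(len(store), "unique chunks for", len(ptrs), "pointers")  # 2 for 3
```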
Data center storage encryption tools offer configuration, management and reporting for disparate encryption solutions that are available for encrypting data in enterprise data centers. The console manages the encryption infrastructure (key management, policy definition, enforcement and access control), and deploys policies and configurations to the component that performs the actual encryption. Key management is handled by the encryption tools or by an enterprise key manager, and data access is managed at the application or data storage layer.
Industry: Technology
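The key-management split described above (a key manager holding master keys, with encryption performed at the data layer) is often implemented as envelope encryption. The sketch below shows that pattern using the third-party Python "cryptography" package, with a second Fernet key standing in for an enterprise key manager; the object names are invented.

```python
# Envelope encryption sketch: a per-object data key encrypts the data, and a
# master key (held by the key manager) wraps the data key for storage.
from cryptography.fernet import Fernet

master_key = Fernet.generate_key()  # in practice, held by the key manager
master = Fernet(master_key)

def encrypt_object(plaintext: bytes) -> tuple[bytes, bytes]:
    data_key = Fernet.generate_key()
    ciphertext = Fernet(data_key).encrypt(plaintext)
    wrapped_key = master.encrypt(data_key)  # only the wrapped key is stored
    return ciphertext, wrapped_key

def decrypt_object(ciphertext: bytes, wrapped_key: bytes) -> bytes:
    data_key = master.decrypt(wrapped_key)
    return Fernet(data_key).decrypt(ciphertext)

ct, wk = encrypt_object(b"customer record")
assert decrypt_object(ct, wk) == b"customer record"
```

This split is what lets a central console rotate or revoke master keys and deploy policy without ever shipping plaintext data keys to the components that do the actual encryption.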